
With the release of the 2015 Program for International Student Assessment (PISA) results on the horizon, the Organization for Economic Cooperation and Development (OECD) Education Office has stoked up the “Math Wars” with a new study. While the October 2016 report examines a number of key questions related to teaching Mathematics, OECD Education chose to highlight its findings on “memorization,” presumably to dispel perceptions about “classroom drill” and its use in various countries.

The OECD, which administers the PISA assessments every three years to 15-year-olds from around the globe, periodically publishes reports looking at slices of the data. Its October 2016 report, Ten Questions for Mathematics Teachers and How PISA Can Help Answer Them, based upon the most recent 2012 results, tends to zero in on “memorization” and attempts to show that high-performing territories, like Shanghai-China, Korea, and Chinese-Taipei, rely less on memory work than lower-performing places like Ireland, the UK, and Australia.

American Mathematics educator Jo Boaler, renowned for “Creative Math,” jumped upon the PISA Study to buttress her case against “memorization” in elementary classrooms. In a highly contentious November 2016 Scientific American article, Boaler and co-author Pablo Zoido contended that the PISA findings confirmed that “memorizers turned out to be the lowest achievers, and countries with high numbers of them—the U.S. was in the top third—also had the highest proportion of teens doing poorly on the PISA math assessment.” Students who relied on memorization, they further argued, were “approximately half a year behind students who used relational and self-monitoring strategies” such as those in Japan and France.

Australian education researcher Greg Ashman took a closer look at the PISA Study and called into question such hasty interpretations of the findings. Figure 1.2: How teachers teach and students learn caught his eye, and he went to work interrogating the survey responses on “memorization” and the axes used to present the data. The PISA analysis, he discovered, also did not include an assessment of how teaching methods might be correlated with PISA scores in Mathematics. Manitoba Mathematics professor Robert Craigen spotted a giant hole in the PISA analysis and noted that the “memorization” data related to “at-home strategies of students,” not their instructional experiences, and may well indicate that students who are improperly instructed in class resort to memorization on their own.

What would it look like, Ashman wondered, if the PISA report had plotted how students performed in relation to the preferred methods used on the continuum from “more student-oriented instruction” to “more teacher-directed instruction”? Breaking down all the data, he generated a new graph that actually showed how teaching method correlated with math performance, and found a “positive correlation” between teacher-directed instruction and higher Math scores. He duly noted that “correlations do not necessarily imply causal relationships,” but the pattern clearly favoured a higher ratio of teacher-directed activity to student orientation.

Jumping on the latest research to seek justification for her own “meta-beliefs” is normal practice for Boaler and her “Discovery Math” education disciples. After junking, once again, the ‘strawmen’ of traditional Mathematics (“rote memorization” and “drill”), Boaler and Zoido wax philosophical and poetic: “If American classrooms begin to present the subject as one of open, visual, creative inquiry, accompanied by growth-mindset messages, more students will engage with math’s real beauty. PISA scores would rise, and, more important, our society could better tap the unlimited mathematical potential of our children.” That’s definitely stretching the evidence far beyond the breaking point.

The “Math Wars” do generate what University of Virginia psychologist Daniel T. Willingham has aptly described as “a fair amount of caricature.” The recent Boaler-Zoido Scientific American article is a prime example of that tendency. Most serious scholars of cognition tend to support the common-ground position that learning mathematics requires three distinct types of knowledge: factual, procedural and conceptual. “Factual knowledge,” Willingham points out, “includes having already in memory the answers to a small set of problems of addition, subtraction, multiplication, and division.” While some students can learn Mathematics through invented strategies, that approach cannot be relied upon for all children. On the other hand, knowledge of procedures is no guarantee of conceptual understanding, particularly when it comes to complexities such as dividing fractions. It’s clear to most sensible observers that knowing math facts, procedures and concepts is what counts when it comes to mastering mathematics.

Simply ignoring research that contradicts your ‘meta-beliefs’ is common on the Math Education battlefield. Recent academic research on “memorization” that contradicts Boaler and her entourage is simply ignored, even that emanating from her own university. Two years ago, Shaozheng Qin and Vinod Menon of Stanford University Medical School led a team that provided scientifically-validated evidence that “rote memorization” plays a critical role in building capacity to solve complex calculations.

Their 2014 Nature Neuroscience study, based upon a clinical study of 68 children, aged 7 to 9, followed over the course of one year, found that memorizing the answers to simple math problems, such as basic addition or multiplication, forms a key step in a child’s cognitive development, helping bridge the gap between counting on fingers and tackling more complex calculations. Memorizing the basics, Qin and Menon concluded, is the gateway to activating the hippocampus, a key brain structure for memory, which gradually expands in “overlapping waves” to accommodate the greater demands of more complex math.

The whole debate over memorization is suspect because of the imprecision in the use of the term. Practice, drilling, and memorization are not the same, even though they get conflated in Jo Boaler’s work and in much of the current Mathematics Education literature. Back in July 2012, D.T. Willingham made this crucial point and provided some valuable points of distinction. “Practice,” as defined by Anders Ericsson, involves performing tasks and receiving feedback on that performance, executed for the purpose of improvement. “Drilling” connotes repetition for the purpose of achieving automaticity, which, at its worst, amounts to mindless repetition or parroting. “Memorization,” on the other hand, relates to the goal of something ending up in long-term memory with ready access, but does not imply using any particular method to achieve that goal.

Memorization has become a dirty word in teaching and learning, laden with so much baggage that it conjures up mental pictures of “drill and kill” in the classroom. The 2016 PISA Study appears to perpetuate such stereotyping and, worst of all, completely misses the “positive correlation” between teacher-directed or explicit instruction and better performance in mathematics.

Why does the PISA Study tend to associate memorization in home-study settings with the drudgery of drill in the classroom?  To what extent does the PISA Study on Mathematics Teaching support the claims made by Jo Boaler and her ‘Discovery Math’ advocates? When it comes to assessing the most effective teaching methods, why did the PISA researchers essentially take a pass? 

 


The Chinese city of Shanghai has a school system that produces students who soar far above the rest. On the 2009 and 2012 Programme for International Student Assessment (PISA) tests, administered to 15-year-olds worldwide, Shanghai-China students ranked first in all three major domains: mathematics, reading and science.

Until recently, the secret of that astounding student achievement success was essentially shrouded in mystery. With the release of the May 17 World Bank study, “How Shanghai Does It,” the answers are beginning to emerge, providing vitally important lessons for education policy-makers in Canadian school systems and far beyond.

The World Bank report on Shanghai education, issued by World Bank research director Harry Patrinos, provides a counterpoint to the prevailing narrative that North American school systems should look to Finland for lessons on school improvement. It demonstrates, in incredible detail, what lies behind Shanghai-China’s rise to ‘education super-nova.’

The report, based upon SABER, a comprehensive World Bank system for benchmarking school system performance, delves deeply into how and why Shanghai students achieve excellent learning results. In the process, it smashes a few stubborn stereotypes and dispels the image of a mechanistic, test-driven, joyless educational enterprise.

Shanghai’s student successes stem, according to the World Bank, from a focus on teaching excellence. What’s unique about Shanghai-China is the way it “grooms, supports, and manages” teachers to raise educational quality, and a culture that accords great respect to the teaching profession.

We know that Shanghai students break records for extraordinary test scores, but lesser known is the success achieved in raising the floor for overall student achievement. The city has the highest share of disadvantaged students in the top 25 per cent range on PISA tests, and that is no accident. Educational equity is becoming a higher priority, especially targeting children of migrants.

Teachers in Shanghai are, by all accounts, well-trained and mentored after they become licensed to teach in schools. Ongoing professional development is not only offered, as in Canada, but integrated into a “collegial and supportive” professional growth process.  Subject mastery and pedagogical training go together in developing skilled and accomplished teachers.

Teaching time is organized far differently than in Canadian schools. Chinese teachers spend only one-third of their time actually teaching; far more emphasis is placed on the preparation of demonstration lessons. Teaching effectiveness is the clear priority, not scattered efforts spread across a range of classes.

Teaching is also rewarded far differently. Instead of being paid on a lock-step grid based upon seniority, Shanghai teachers move up the ladder based upon merit, guided by principals who are trained as instructional leaders, not building administrators.

The biggest surprise is how Shanghai’s school system works to reduce educational inequalities. While education funding is vested in the school district, a proportion of the ‘education tax’ is specifically allocated to poor and low performing school districts.

One educational innovation worth emulating is the “entrusted school” management model used to raise up underperforming schools. High-performing Shanghai schools are “twinned” with struggling schools within the state system. Instead of establishing private schools or creating charters, the Chinese use “twinning” to extend management, training, and resource support to teachers and students in the struggling schools.

Since 2006, the world of education has been enraptured with the so-called “Finnish Miracle,” while Shanghai-China has surged far ahead in student achievement. Instead of hitching our school improvement wagon to Finnish education promoter extraordinaire Pasi Sahlberg and his Finnish lessons, we should be looking at best practice anywhere and everywhere.

Let’s start by finding out where exactly we rank and what might be the areas that need improvement.  We generate lots of national, provincial and international student performance data, so why not put it to better use?

A really bold initiative would be to invite the World Bank to assess one Canadian provincial school system in relation to the SABER benchmarks.  The State of Maryland in the United States has already done so, and the SABER report for Maryland demonstrates just how incredibly valuable it can be in planning for, and advancing, school improvement.

The Finnish Education Miracle has begun to lose its lustre. Perhaps it’s time to consider edutourism junkets to Shanghai instead of Helsinki – in search of educational excellence as well as innovative teaching-learning ideas.

*An earlier version of this Commentary appeared in The Telegraph-Journal, provincial edition, based in Saint John, NB.

Will the World Bank report on Shanghai’s Educational Success be a wake-up call for North American educational leaders? Do popular stereotypes about Chinese education obscure our vision of Shanghai’s remarkable student performance achievements? Should we be producing more detailed studies of “Shanghai Lessons” for educators? And which Canadian province will be the first to follow Maryland in stepping-up to participate in the SABER assessment of school system effectiveness? 

 


Two retired Ontario educators, Dr. Denis Mildon and Gilles Fournier, have now surfaced in an attempt to preserve and protect the educational investment legacy of the Dalton McGuinty Liberal reform agenda (2003-13). In a Toronto Star opinion column (July 6, 2015), they repeat the familiar claim that Ontario’s system is “considered one of the finest in the world.”


Ontario’s educational supremacy is presented, as usual, as a statement of incontestable fact. “Through sound research, innovation and policy development, Ontario’s system,” Mildon and Fournier contend, “has become a model of equity and inclusiveness in education and, as a result, in student achievement.”

Ontario education under McGuinty was certainly among the best-resourced systems in the world. With OISE school change theorists Michael Fullan and Ben Levin championing increased system-wide investment, spending skyrocketed by over 57 per cent from 2003 to 2011, reaching $22 billion, while school enrollment fell by some 6 per cent. Public funding poured in to support a series of Poverty Reduction initiatives, enhanced special program supports, universal full-day Kindergarten, and even Parents Reaching Out (PRO) Grants for parent education.

The origin, of course, of the now infamous “Best System” claim is the two McKinsey and Company reports (2007 and 2010) purporting to identify and then analyze the success of twenty of the world’s leading education systems. It also echoes the very wording used by the Ontario education reform architect Michael Fullan in a high-profile 2012 Atlantic article assessing the success of his own initiatives. Aside from Fullan’s foreword to the 2010 report, there is surprisingly little about Ontario initiatives in the actual report, except for one passing reference to PRO grants.

Repeating such claims, referencing the reform advocates themselves, is wearing mighty thin as fresh evidence accumulates that closing the education equality gap does not necessarily translate into improved student achievement. Even more telling, much of the McGuinty era funding-driven “progress” was fueled by increases in spending that are simply unsustainable.

Outsized claims of educational excellence based upon the McKinsey & Company reports are now highly problematic. British researcher Frank Coffield’s 2012 critique of the reports, published in the Journal of Education Policy, has shredded the research and raised serious questions about the reports’ credibility.  Alarmed that the report’s analysis and prescriptions have “hardened into articles of faith” among politicians and policy makers, he argues that the McKinsey-Fullan system-wide reform agenda will “not improve school systems.”

Much of Coffield’s critique of McKinsey-style reform applies to Ontario, the Canadian province where Fullan field-tested his school change theories from 2003 to 2013. Centralized reform initiatives, like Fullan’s, he shows, reflect “an impoverished view” of the state of teaching and learning, favouring professionalization over school-level initiatives.

Coffield is particularly skeptical about the legitimacy of the whole assessment. Claims of student success by McKinsey and Fullan are problematic because of the “weak evidence base” and suspect claims about “educational leadership” that “outrun the evidence” in the reports. He’s also troubled by the McKinsey-Fullan language which sounds “technocratic and authoritarian.”  Cultural and socio-ethnic differences are also “underplayed” in such systems-thinking and there is little or no recognition of the role democratic forces play in the public education domain.

One of the few Canadian educators to raise flags about the McKinsey-Fullan ideology was former Peel Catholic Board teacher Stephen Hurley. Writing in March 2011 on the CEA Blog, he expressed concern over the report’s basic assumptions – that teachers come with “low skills” and that centralized approaches are best at fostering professional growth.

Hurley pinpointed two critical weaknesses of the McKinsey-Fullan reform agenda. “As we move forward, how do we give back to our teachers that professional space to develop a strong sense of purpose and efficacy?  How do we as teachers work to reclaim our identities as highly trained and highly competent professionals?”

Two years after McGuinty’s fall from grace, serious questions are being asked about whether the lavish education spending actually produced better results. Staking the claim on rising graduation rates is suspect because, while the graduation rate rose from 68 to 83 per cent, we know that “attainment levels” do not usually reflect higher achievement levels, especially when more objective performance measures, such as student Math scores, stagnated during those years.

Upon closer scrutiny, the Mildon and Fournier commentary is not about protecting student achievement gains at all. Defending current time-consuming evaluation practices, smaller class sizes, preparation time, banking of sick days, ready access to substitute teachers, and current curriculum approaches sounds far more like a teacher-driven agenda for Ontario schools. Wrapping Ontario education in that “world leading school system” banner does not have the appeal or resonance it once had, now that parents and the public have a better read on the actual results of that rather high-cost reform agenda.

What did the Dalton McGuinty Education Reform agenda actually achieve in terms of improving student progress and achievement? Where are the independent assessments of McGuinty education reforms supported by serious professionally validated research? Will the Education Reform global “success” story turn out to be essentially a carefully constructed, nicely-packaged mirage?


Today the Organization for Economic Cooperation and Development (OECD) has succeeded in establishing the Program for International Student Assessment (PISA) test and national rankings as the “gold standard” in international education. Once every three years since 2000, PISA has provided a global benchmark of where students 15 years of age rank in three core competencies: reading, mathematics, and science. Since its inception, United States educators have never been enamoured with international testing, in large part because American students rarely fare very well.

So, when the infamous OECD PISA Letter was published in early May 2014 in The Guardian and later The Washington Post, the initial signatory list contained the names of some familiar American anti-testing crusaders, such as Heinz-Dieter Meyer (SUNY, Albany), David Berliner (Arizona State University), Mark Naison (BAT, Fordham University), Noam Chomsky (MIT) and Alfie Kohn, the irrepressible education gadfly. That letter, addressed to Andreas Schleicher, OECD, Paris, registered serious concerns about “the negative consequences of the PISA rankings” and appealed for a one-cycle (three-year) delay in the further implementation of the tests.

The global campaign to discredit PISA earned a stiff rebuke in Canada. On June 11 and June 18, 2014, the C.D. Howe Institute released two short commentaries demonstrating the significant value of PISA test results and effectively countering the appeal of the anti-PISA Letter. Written by Education Fellow John Richards, the two-part report highlighted the “Bad News” in Canada’s PISA Results and then proceeded to identify What Works (specific lessons to be learned) based upon an in-depth analysis of the once-every-three-years tests. In clear, understandable language, Richards identified four key findings to guide policies formulated to “put Canadian students back on track.”

The call for a pause in the PISA tests was clearly an attempt to derail the whole international movement to establish benchmarks of student performance and some standard of accountability for student achievement levels in over 60 countries around the world. It was mainly driven by American anti-testers, but the two Canadian-based signatories were radical, anti-colonialist academics, Henry Giroux (English and Cultural Studies, McMaster University) and Arlo Kempf (Visiting Professor, Program Coordinator, School and Society, OISE).

Leading Canadian educationists like Dr. Paul Cappon (former CEO, Council on Learning) and even School Change guru Michael Fullan remain supporters of comparative international student assessments. That explains why no one of any real standing or clout from Canada was among the initial group, and, by late June, only 32 Canadian educationists could be found among the 1,988 signatories from all over the globe. Most of the home-grown signatories were well-known educators in what might be termed the “accountability-free” camp, many of them, like E. Wayne Ross (UBC) and Marc Spooner (U Regina), fierce opponents of “neo-liberalism” and its supposed handmaiden, student testing.

John Richards’ recent C.D. Howe commentaries should, at least temporarily, silence the vocal band of Canadian anti-testers. His first commentary made very effective use of PISA student results to bore deeply into our key strengths and issues of concern, province by province, focusing particularly on student competencies in mathematics. That comparative analysis is fair, judicious, and research-based, in sharp contrast to the honey-coated PISA studies regularly offered up by the Council of Ministers of Education (Canada).

The PISA results tell the story. While he finds Canadian students overall “doing reasonably well,” the main concern is statistical declines in all provinces in at least one subject, usually either mathematics or reading. Quebec leads in Mathematics, but in no other subject. Two provinces (PEI and Manitoba) experienced significant declines in all three subject areas. Performance levels have sharply declined (over 30 points) in mathematics in both Manitoba and Canada’s former leader, Alberta. Such results are not a ringing endorsement of the Mathematics curriculum based upon the Western and Northern Canada Protocol (WNCP).

The warning signs are, by now, well known, but the real value in Richards’ PISA Results analysis lies in his very precise explanation of the actual lessons to be learned by educators. What really matters, based upon PISA results, are public access to early learning programs, posting of school-level student achievement results, paying professional-level teacher salaries, and the competition provided by achievement-oriented private and independent (not-for-profit) schools. Most significantly, his analysis confirms that smaller class sizes (below 20 pupils per class) and increasing mathematics teaching time have a negligible effect on student performance results.

The C.D. Howe PISA Results analysis hit home with The Globe and Mail, drawing a favourable editorial, but was predictably ignored by the established gatekeepers of Canada’s provincial education systems. Why the reluctance to confront such research-based, common-sense findings? “Outing” the chronic under-performance of students from certain provinces (PEI, Manitoba, New Brunswick, and Nova Scotia) is taboo, particularly inside the tight CMEC community and within self-referenced Canadian Education Association (CEA) circles. For the current Chair of CMEC, Alberta Education Minister Jeff Johnson, any public talk of Alberta’s precipitous decline in Mathematics is anathema.

Stung by the PISA warning shots, Canada’s provincial education gatekeepers tend to be less receptive to sound, research-based, practical policy correctives. That is a shame, because the John Richards reports demonstrate that both “sides” in the ongoing Education War are half-right and, by mixing and matching, we could fashion a much more viable, sustainable, effective policy agenda. Let’s tear up the existing and tiresome Neo-Con vs. Anti-Testing formulas and re-frame education reform around what works: broader access to early learning, open accountability for student performance levels, paying respectable, professional-level teacher salaries, and welcoming useful competition from performance-driven private and independent schools.

What’s the recent American public noise over “PISAfication” all about anyway? Why do so many North American educators still tend to dismiss the PISA Test and the sound, research-based studies stemming from the international testing movement? To what extent do John Richards’ recent C.D. Howe Institute studies suggest the need for a total realignment of provincial education reform initiatives?

 

 


The ‘Big Test’ has hit us and rocked our education world. The sliding math scores of Canadian 15-year-olds outside Quebec have just captured all the headlines, and a series of PISA news stories and commentaries identified the “discovery learning” approach to teaching mathematics as the source of the recent, and continuing, decline. Columnist Konrad Yakabuski, a close observer of the American education wars, saw the declining math scores as a “damaging legacy” of discovery learning. We are falling backward, he claimed, in both excellence and equity, raising the fundamental question: “Has the education elite learned its lesson?”

In the OECD’s 2012 Programme for International Student Assessment (PISA) rankings released December 3, 2013, Canada dropped out of the top 10 in student mathematics scores, a decline that raised alarms about the country’s future prosperity. Canadian students placed 13th overall in mathematics, down three spots from 2009 and six spots from 2006, in the highly anticipated test conducted every three years which measures how 15-year-olds around the world are doing in math, reading and science. Canada ranked behind many Asian economies, including Shanghai (China), Singapore, Korea and Japan, while the United States lagged far behind, placing 36th out of 65 participating countries.

The PISA test jolt comes on the heels of declining math scores nationally and a surprisingly poor showing from youth on a recent OECD literacy and numeracy test. The Canadian math curriculum, ushered in over the past decade, is catching the blame for lower scores, and for good reason. Curricula like the Western and Northern Canada Protocol (WNCP) are out of sync with high-performing Asian countries because they place far more emphasis on real-world concepts than on abstract thinking, standard algorithms, and practice. The accompanying OECD report, in fact, noted that the top performers had more exposure to formal mathematics than word problems. That may explain why Shanghai students topped the rankings and performed three grade levels above those of most other nations.

Topping the PISA student performance rankings attracts international acclaim, school system imitators, and increasingly scarce public education dollars. Once reviled by Canadian anti-testing advocates, the PISA test results are, oddly enough, what provides the ammunition for much of what now passes for informed debate over quality, equity, and accountability in Canada’s provincial school systems. They also bred a certain Canadian complacency until the recent release of the 2012 student results.

National and provincial reputations now ride on the PISA results. From 2000 to 2006, the PISA test results catapulted Finland’s education system to star status, and that ‘Finnish infatuation’ essentially swept the Canadian educational establishment off its feet, blinding us to Quebec’s success in mathematics and Ontario’s progress in improving reading and closing the socio-economic education gap.

Between 2000 and 2009, Canada plateaued in overall student performance and Canadian students posted a 10 per cent decline in reading scores. This week’s PISA results confirm that 15-year-old Canadian students, with the exception of those in Quebec, are losing ground, particularly in mathematics.

The rise and fall of Alberta, Canada’s former top performing province, contains a few valuable lessons. Two decades ago, Alberta was the first province to really confront the global learning gap, forecasting that, if trends continued, Albertan and Canadian students were going to be left behind.  

Dr. Joe Freedman, a Red Deer radiologist, and Andrew Nikiforuk, a Calgary-based Globe and Mail columnist, raised the first alarm bells and founded Albertans for Quality Education.  In 1991, they convinced the Alberta Chamber of Resources (ACR) and the Conference Board of Canada to produce a truly ground-breaking study,  International Comparisons in Education, comparing Alberta math and science curriculum with that in Japan, Germany and Hungary.

Alberta’s mathematics and science curriculum was then virtually re-written and bench-marked against that of the top performing nations. Under Education Minister Jim Dinning, the province built its rock solid reputation on raising standards, student testing, school choice and charter schools.

While Alberta ranked first on the PISA tests and topped the Pan-Canadian Assessment Programme (PCAP) tests in literacy and science for most of two decades, it has slipped precipitously since 2006. Adopting the WNCP math curriculum with its “discovery learning” focus and the Finnish infatuation have been key factors in the decline.

The ‘Finnish solution’ began to lose its lustre after the 2009 PISA test when Finland saw its reading scores drop by 11 per cent. Outside of Canada, education policy analysts have now become far more enamoured with Asian school systems like Shanghai and Korea.

None of this seems to matter to Canadian ‘progressives,’ sponsoring a Canadian tour for Finnish education expert Pasi Sahlberg, promoting Finland as the “Global Fourth Way,” and seeking to curtail standardized testing. They are bent on turning back the dreaded “GERM,” the Global Education Reform Movement, supposedly carrying the plague of “neo-liberalism” and its principal strains — higher standards, school choice, and competition in public education.

The Alberta Teachers Association (ATA), armed with a 2012 report written by Sahlberg’s North American ally, Andy Hargreaves, now talks of “transforming Alberta education” with “The Fourth Way,” and is out to dismantle provincial testing, curtail expanded classroom learning time, and block teacher assessment tied to student performance. More recently, the Finnish wave of “personalized learning” has reached British Columbia.

Finland, like Canada, got a jolt from the 2012 PISA test results. That will finally prompt education observers to acknowledge that Finnish education is fuzzy on standards.  It is, after all,  light on standardized testing, soft on homework, and promotes a “culture of trust” instead of accountability.

Looking deeper, Finland is also a “one provider” system with little or no choice for parents, delays the start of school until age 7, and streams students after Grade 9  into two tracks, academic and vocational, based upon arbitrary average-mark cut-offs.

The Canadian attraction to “discovery learning” and the rush to abandon standardized testing have both hit a significant bump in the road. In the wake of the 2012 PISA results, Canadians are awakening to the dangers of turning back the clock to the days of ‘accountability-free’ public education. Without PISA and the OECD follow-up research studies we are left almost completely in the dark on critical educational quality issues that matter for students and our public schools.

What are the powerful lessons of Canada’s recent decline in PISA test scores?  When will Canadian mathematics educators face reality and come to accept the need to develop a more rigorous, soundly-based curriculum providing a solid grounding in the fundamental skills?  Will Canada come to accept the need to stop being what Paul Cappon aptly termed “a school that never issues report cards”?  And finally, is the real message sinking in?
