
Archive for the ‘PISA Test Rankings’ Category

“Canadians can be proud of our showing in the 2015 Programme for International Student Assessment (PISA) report,” declared science consultant Bonnie Schmidt and former Council of Ministers of Education (CMEC) director Andrew Parkin in their first-off-the-mark December 6, 2016 response to the results. “We are,” they added, “one of only a handful of countries that places in the top tier of the Organization for Economic Co-operation and Development (OECD) in each of the three subjects tested: science, reading and math.”

“Canada” and “Canadian students,” we were told, were once again riding high in the once-every-three-years international test sweepstakes. If that effusively positive response had a familiar ring, it was because it followed the official line advanced by a markedly similar CMEC media release, issued a few hours before the commentary.

Since our students, in each of our ten provincial school systems, were “excelling,” it was time for a little national back-slapping. There’s one problem with that blanket analysis: it serves to maintain the status quo, engender complacency, obscure the critical Mathematics scores, and disguise the lopsided nature of student performance from region to region.

Hold on, not so fast, CMEC — the devil is in the details, portrayed more clearly in the OECD’s own “Country Profile” for Canada. Yes, 15-year-olds in three Canadian provinces (Alberta, British Columbia, and Quebec) achieved some excellent results, but overall Mathematics scores were down, and students in over half of our provinces trailed off into mediocrity. Our real success was not in performance, but rather in reducing the achievement gap adversely affecting disadvantaged students.

More than half a million 15-year-olds in 72 jurisdictions around the world completed PISA tests, and Schmidt and Parkin were not alone in making sweeping pronouncements about why some countries are up and others down in the global rankings.

Talking in aggregate terms about the PISA performance of 20,000 Canadian students in ten different provinces can be, and is, misleading when performance in mathematics continues to lag, Ontario students continue to underperform, and students in two provinces, Manitoba and Saskatchewan, struggle in science, reading, and mathematics. Explaining all that away is what breeds complacency in the school system.

My own PISA 2015 forecast was way off-base — and taught me a lesson. After the TIMSS 2015 Mathematics results released in November 2016, an East Asian sweep, led by Singapore and Korea, seemed like a safe bet. How Finland performs also attracts far less attention than it did in its halcyon days back in 2003 and 2006. The significant OECD pivot from excellence to equity caught me napping, and I completely missed the significance of the move (2012 to 2015) from pencil-and-paper to computer-based tests.

Some solace can be found in the erroneous forecasts of others. The recent Alberta Teachers’ Association (ATA) “Brace Yourself” memo, with its critique of standardized testing, seemed to forecast a calamitous drop in Alberta student performance levels. It happened only in Mathematics.

Advocates of the ‘Well-Being’ curriculum and broader assessment measures, championed by Toronto’s People for Education, will likely be temporarily thrown off-stride by the OECD’s new-found commitment to assessing equity in education. It will be harder now to paint PISA as evil and to discredit PISA results based upon such a narrow range of skills in reading, math and science.

The OECD’s “Country Profile” of Canada is worth studying carefully because it aggregates data from 2003 to 2015, clarifies the trends, and shows how Canadian students continue to struggle in mathematics far more than in reading and science.

Canadian students may have finished 12th in Mathematics with a 516 aggregate score, but the trend line continues to be in decline, down from 532 in 2003. Digging deeper, we see that students in only two provinces, Quebec (544) and BC (522), actually exceeded the national mean score. Canada’s former leader in Mathematics performance, Alberta, continued its downward spiral from the lofty heights of 549 (2003) to 511 (2015).

Since Ontario students’ provincial mathematics scores are declining, experts will be poring over the latest PISA results to see how bad it is in relation to the world’s top-performing systems. No surprises here: Ontario students scored 509, finishing 4th in Canada and down from 530 on PISA 2003. Excellence will require a significant change in direction.

The biggest discovery in post-2015 PISA analysis was the positive link between explicit instruction and higher achievement in the 2015 core assessment subject, science. The most important factor linked with high performance remains socio-economic status (SES), but teacher-guided instruction ranked close behind, and students taught with minimal direction, in inquiry or project-based classes, simply performed less well on the global test.

The results of the 15-year-olds are largely determined over 10 years of schooling, and are not necessarily the direct consequence of the latest curriculum fad, such as “discovery math.”

It’s better to look deeper into what this cohort of students was learning when they first entered the school system, in the mid-2000s. In the case of Canadian students, for example, student-centred learning was at its height, and the country was just awakening to the value of testing to determine what students were actually learning in class.

Where the student results are outstanding, such as Singapore and Estonia, it is not solely attributable to the excellence of teaching or the rigour of the math and science curriculum.

We know from the “tutoring explosion” in Canada’s major cities that the prevalence of private tuition classes after school is a contributing factor, and may explain the current advantage still enjoyed in mathematics by Pacific Rim students.

Children of Chinese heritage in Australia actually outperformed students in Shanghai on the 2012 PISA test, and we need to explore whether that may be true for their counterparts in Greater Vancouver. The so-called “Shanghai Effect” may be attributed as much to “tiger mothers” as it is to the quality of classroom instruction.

Whether Canada and Canadians continue to exhibit high PISA self-esteem or have simply plateaued does not matter as much as what we glean over the next few years from studying best international practice in teaching, learning, and assessment.

Surveying PISA student results, this much is clear: standing still is not an option in view of the profound changes that are taking place in life, work, and society.

 


With the release of the 2015 Program for International Student Assessment (PISA) on the horizon, the Organization for Economic Cooperation and Development (OECD) Education Office has stoked up the “Math Wars” with a new study. While the October 2016 report examines a number of key questions related to teaching Mathematics, OECD Education chose to highlight its findings on “memorization,” presumably to dispel perceptions about “classroom drill” and its use in various countries.

The OECD, which administers the PISA assessments every three years to 15-year-olds from around the globe, periodically publishes reports looking at slices of the data. Its most recent, the October 2016 report Ten Questions for Mathematics Teachers and How PISA Can Help Answer Them, based upon the most recent 2012 results, zeroes in on “memorization” and attempts to show that high-performing territories, like Shanghai-China, Korea, and Chinese-Taipei, rely less on memory work than lower-performing places like Ireland, the UK, and Australia.

American Mathematics educator Jo Boaler, renowned for “Creative Math,” jumped upon the PISA study to buttress her case against “memorization” in elementary classrooms. In a highly contentious November 2016 Scientific American article, Boaler and co-author Pablo Zoido contended that PISA findings confirmed that “memorizers turned out to be the lowest achievers, and countries with high numbers of them—the U.S. was in the top third—also had the highest proportion of teens doing poorly on the PISA math assessment.” Students who relied on memorization, they further argued, were “approximately half a year behind students who used relational and self-monitoring strategies” such as those in Japan and France.

Australian education researcher Greg Ashman took a closer look at the PISA study and called into question such hasty interpretations of the findings. Figure 1.2: How teachers teach and students learn caught his eye, and he went to work interrogating the survey responses on “memorization” and the axes used to present the data. The PISA analysis, he discovered, also did not include an assessment of how teaching methods might be correlated with PISA scores in Mathematics. Manitoba Mathematics professor Robert Craigen spotted a giant hole in the PISA analysis, noting that the “memorization” data related to the “at-home strategies of students,” not their instructional experiences, and may well indicate that students who are improperly instructed in class resort to memorization on their own.

What would it look like, Ashman wondered, if the PISA report had plotted how students performed in relation to the preferred methods used on the continuum from “more student-oriented instruction” to “more teacher-directed instruction”? Breaking down all the data, he generated a new graph that actually showed how teaching method correlated with math performance and found a “positive correlation” between teacher-directed instruction and higher Math scores. Correlations, he duly noted, do not necessarily imply causal relationships, but a higher ratio of teacher-directed activity to student orientation was clearly associated with better scores.

Jumping on the latest research to seek justification for her own “meta-beliefs” is normal practice for Boaler and her “Discovery Math” education disciples. After junking, once again, the ‘strawmen’ of traditional Mathematics — “rote memorization” and “drill” — Boaler and Zoido wax philosophical and poetic: “If American classrooms begin to present the subject as one of open, visual, creative inquiry, accompanied by growth-mindset messages, more students will engage with math’s real beauty. PISA scores would rise, and, more important, our society could better tap the unlimited mathematical potential of our children.” That’s definitely stretching the evidence far beyond the breaking point.

The “Math Wars” do generate what University of Virginia psychologist Daniel T. Willingham has aptly described as “a fair amount of caricature.” The recent Boaler-Zoido Scientific American article is a prime example of that tendency. Most serious scholars of cognition tend to support the common-ground position that learning mathematics requires three distinct types of knowledge: factual, procedural and conceptual. “Factual knowledge,” Willingham points out, “includes having already in memory the answers to a small set of problems of addition, subtraction, multiplication, and division.” While some students can learn Mathematics through invented strategies, it cannot be relied upon for all children. On the other hand, knowledge of procedures is no guarantee of conceptual understanding, particularly when it comes to complexities such as dividing fractions. It’s clear to most sensible observers that knowing math facts, procedures and concepts is what counts when it comes to mastering mathematics.

Simply ignoring research that contradicts your ‘meta-beliefs’ is common on the Math Education battlefield. Recent academic research on “memorization” that contradicts Boaler and her entourage is simply ignored, even when it emanates from her own university. Two years ago, Shaozheng Qin and Vinod Menon of Stanford University Medical School led a team that provided scientifically-validated evidence that “rote memorization” plays a critical role in building capacity to solve complex calculations.

In their 2014 Nature Neuroscience study, based upon a clinical study of 68 children, aged 7 to 9, followed over the course of one year, Qin, Menon and colleagues found that memorizing the answers to simple math problems, such as basic addition or multiplication, forms a key step in a child’s cognitive development, helping bridge the gap between counting on fingers and tackling more complex calculations. Memorizing the basics, they concluded, is the gateway to activating the hippocampus, a key brain structure for memory, which gradually expands in “overlapping waves” to accommodate the greater demands of more complex math.

The whole debate over memorization is suspect because of the imprecision in the use of the term. Practice, drilling, and memorization are not the same, even though they get conflated in Jo Boaler’s work and in much of the current Mathematics Education literature. Back in July 2012, D.T. Willingham made this crucial point and provided some valuable points of distinction. “Practice,” as defined by Anders Ericsson, involves performing tasks and receiving feedback on that performance, executed for the purpose of improvement. “Drilling” connotes repetition for the purpose of achieving automaticity, which, at its worst, amounts to mindless repetition or parroting. “Memorization,” on the other hand, relates to the goal of something ending up in long-term memory with ready access, but does not imply using any particular method to achieve that goal.

Memorization has become a dirty word in teaching and learning, laden with so much baggage that it conjures up mental pictures of “drill and kill” in the classroom. The 2016 PISA study appears to perpetuate such stereotyping and, worst of all, completely misses the “positive correlation” between teacher-directed or explicit instruction and better performance in mathematics.

Why does the PISA Study tend to associate memorization in home-study settings with the drudgery of drill in the classroom?  To what extent does the PISA Study on Mathematics Teaching support the claims made by Jo Boaler and her ‘Discovery Math’ advocates? When it comes to assessing the most effective teaching methods, why did the PISA researchers essentially take a pass? 

 


The Chinese city of Shanghai has a school system that produces students who soar far above the rest. On the 2009 and 2012 Programme for International Student Assessment (PISA) tests, administered to 15-year-olds worldwide, Shanghai-China students ranked first in all three major domains: mathematics, reading and science.

Until recently, the secret of that astounding student achievement success was essentially shrouded in mystery. With the release of the May 17 World Bank study, “How Shanghai Does It,” the answers are beginning to emerge, providing vitally important lessons for education policy-makers in Canadian school systems and far beyond.

The World Bank report on Shanghai education, issued by World Bank research director Harry Patrinos, provides a counterpoint to the prevailing narrative that North American school systems should look to Finland for lessons on school improvement. It demonstrates, in incredible detail, what lies behind Shanghai-China’s rise to ‘education super-nova.’

The report, based upon SABER, a comprehensive World Bank system for benchmarking school system performance, delves deeply into how and why Shanghai students achieve excellent learning results. In the process, it smashes a few stubborn stereotypes and dispels the image of a mechanistic, test-driven, joyless educational enterprise.

Shanghai’s student successes stem, according to the World Bank, from a focus on teaching excellence. What’s unique about Shanghai-China is the way it “grooms, supports, and manages” teachers to raise educational quality, and a culture that accords great respect to the teaching profession.

We know that Shanghai students break records for extraordinary test scores, but lesser known is the success achieved in raising the floor for overall student achievement. The city has the highest share of disadvantaged students in the top 25 per cent range on PISA tests, and that is no accident. Educational equity is becoming a higher priority, especially targeting children of migrants.

Teachers in Shanghai are, by all accounts, well-trained and mentored after they become licensed to teach in schools. Ongoing professional development is not only offered, as in Canada, but integrated into a “collegial and supportive” professional growth process.  Subject mastery and pedagogical training go together in developing skilled and accomplished teachers.

Teaching time is organized far differently than in Canadian schools. Shanghai teachers spend only one-third of their time actually teaching, and far more emphasis is placed on the preparation of demonstration lessons. Teaching effectiveness is the clear priority, not scattered efforts spread across a range of classes.

Teaching is also rewarded far differently. Instead of being paid on a lock-step grid based upon seniority, Shanghai teachers move up the ladder based upon merit, guided by principals who are trained as instructional leaders, not building administrators.

The biggest surprise is how Shanghai’s school system works to reduce educational inequalities. While education funding is vested in the school district, a proportion of the ‘education tax’ is specifically allocated to poor and low performing school districts.

One educational innovation worth emulating is the “entrusted school” management model used to raise up underperforming schools. High-performing Shanghai schools are “twinned” with struggling schools within the state system. Instead of establishing private schools or creating charters, the Chinese use “twinning” to extend management, training, and resource support to teachers and students in the struggling schools.

Since 2006, the world of education has been enraptured with the so-called “Finnish Miracle,” while Shanghai-China has surged far ahead in student achievement. Instead of hitching our school improvement wagon to Finnish education promoter extraordinaire Pasi Sahlberg and his Finnish lessons, we should be looking at best practice anywhere and everywhere.

Let’s start by finding out where exactly we rank and what might be the areas that need improvement.  We generate lots of national, provincial and international student performance data, so why not put it to better use?

A really bold initiative would be to invite the World Bank to assess one Canadian provincial school system in relation to the SABER benchmarks.  The State of Maryland in the United States has already done so, and the SABER report for Maryland demonstrates just how incredibly valuable it can be in planning for, and advancing, school improvement.

The Finnish Education Miracle has begun to lose its lustre. Perhaps it’s time to consider edutourism junkets to Shanghai instead of Helsinki – in search of educational excellence as well as innovative teaching-learning ideas.

*An earlier version of this Commentary appeared in The Telegraph-Journal, provincial edition, based in Saint John, NB.

Will the World Bank report on Shanghai’s Educational Success be a wake-up call for North American educational leaders? Do popular stereotypes about Chinese education obscure our vision of Shanghai’s remarkable student performance achievements? Should we be producing more detailed studies of “Shanghai Lessons” for educators? And which Canadian province will be the first to follow Maryland in stepping-up to participate in the SABER assessment of school system effectiveness? 

 


A lively national conversation is underway in the United States over stalled upward mobility and stark income inequality, and it has a more genteel echo in Canada. Many North American educators point to poverty as the explanation for American students’ mediocre test scores, and it also serves as a favoured rationale for explaining away the wide variations in achievement levels among and within Canadian provinces. Only recently have policy analysts, drilling down into the PISA 2012 Mathematics data, begun to look at the alarming achievement gap between states and provinces, the relationship between education expenditures and performance levels, and the bunching of students in the mid-range of achievement.

The socio-economic determinists offer a simple-minded, mono-causal explanation for chronic student under-performance. American education policy analyst Michael Petrilli and Brandon Wright of The Thomas B. Fordham Institute recently recapped the standard lines: If teachers in struggling U.S. schools taught in Finland, says Finnish educator Pasi Sahlberg, they would flourish—in part because of “support from homes unchallenged by poverty.” Michael Rebell and Jessica Wolff at Columbia University’s Teachers College argue that middling test scores reflect a “poverty crisis” in the United States, not an “education crisis.” Adding union muscle to the argument, American Federation of Teachers president Randi Weingarten calls poverty “the elephant in the room” that accounts for poor student performance.

The best data we have to tackle these critical questions comes from the OECD, which administers the Program for International Student Assessment (PISA) and just released its annual Education at a Glance 2015 report. For its own analyses, PISA uses an index of economic, social, and cultural status (ESCS) that considers parental occupation and education, family wealth, home educational resources, and family possessions related to “classical” culture. PISA analysts use the index to stratify each country’s student population into quartiles. That broadens the focus so it’s not just about addressing the under-performance of disadvantaged children.
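The quartile stratification PISA analysts perform on the ESCS index is straightforward to illustrate. A minimal sketch, using invented index values rather than real PISA microdata, might look like this:

```python
# Hypothetical illustration only: the ESCS-style index values below are
# invented; real PISA analyses work from the official student microdata.
import statistics

escs = [-1.2, -0.8, -0.5, -0.1, 0.0, 0.2, 0.4, 0.7, 0.9, 1.3, 1.6, 2.0]

# cut points at the 25th, 50th, and 75th percentiles of the index
q1, q2, q3 = statistics.quantiles(escs, n=4)

def quartile(x):
    """Assign a student to a socio-economic quartile (1 = lowest)."""
    if x <= q1:
        return 1
    if x <= q2:
        return 2
    if x <= q3:
        return 3
    return 4

groups = [quartile(x) for x in escs]
print(groups)
```

Once every student carries a quartile label, comparisons like those in the next paragraph — how well a system teaches its top-quartile versus bottom-quartile students — reduce to comparing mean scores across the four groups.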

The PISA socio-economic analysis identifies the key variations among international educational jurisdictions. Countries like Belgium and France are relatively better at teaching their higher-status students, while other countries like Canada and Finland do relatively better at instructing students from lower-status families. Contrary to past assumptions, the United States falls almost exactly on the regression line. It does equally well (or equally poorly, if you prefer) at teaching the least well-off as those coming from families in the top quartile of the ESCS index.

A Fall 2014 Education Next report by Eric Hanushek, Paul Peterson and Ludger Woessmann pointed out the wide variations, country-to-country, in overall Mathematics proficiency. Some 35 percent of the members of the U.S. class of 2015 (NAEP) reach or exceed the proficiency level in math. Based on their calculations, this percentage places the United States at the 27th rank among the 34 OECD countries. That ranking is somewhat lower for students from advantaged backgrounds (28th) than for those from disadvantaged ones (20th).

Overall assessments of Mathematics proficiency on PISA offer no real surprises. Compared to the U.S., the percentage of students who are math proficient is nearly twice as large in Korea (65%), Japan (59%), and Switzerland (57%). The United States also lags behind Finland (52%), Canada (51%), Germany (50%), Australia (45%), France (42%), and the United Kingdom (41%). Within the U.S., the range is phenomenal – from a high of 51% in Massachusetts to a low of 19% in Mississippi.

Cross-national comparisons can be misleading, because Canadian students have plateaued on the PISA tests over the past decade. While Canada is still among the high-level achievers, the performance of the country’s 15-year-olds in mathematics has declined, with a 14-point dip over the past nine years. While performance in reading has remained relatively stable, the decline in science performance was “statistically significant,” down from averages of 534 in 2006 and 529 in 2009.

Much like the United States, Canada exhibits significant variations from one provincial school system to another. A 2013 Council of Ministers of Education, Canada (CMEC) review of the OECD PISA 2012 Mathematics performance levels revealed the stark achievement inequalities. Four Canadian provinces set the pace – Quebec, British Columbia, Alberta, and Ontario – and the remaining six are a drag on our average scores. Fully 25% of Prince Edward Island students score below Level 2 in Mathematics proficiency, worse than the OECD average (23%). The provinces with the next highest levels of under-performers were Manitoba (21%), Newfoundland and Labrador (21%), Nova Scotia (18%), and New Brunswick (16%).

There is no case for complacency in Canada, as pointed out, repeatedly, by Dr. Paul Cappon, former CEO of the Canadian Council on Learning (2005-2011) and our leading expert on comparative international standards. For a “high-achieving” country, Canada has a lower proportion of students who perform at the highest levels of Mathematics on recent PISA tests (CMEC 2013, Figure 1.3, p. 25). Canada’s 15-year-olds are increasingly bunched in the mid-range and, when it comes to scoring Level 4 and above on Mathematics, most provinces score at or below the OECD average of 31%. The proportion of high-achievers (Level 4 and above in 2012) was as follows: PEI (22%); Newfoundland and Labrador (27%); Nova Scotia (28%); Manitoba (28%); Saskatchewan (33%); and Ontario (36%). Mathematics students from Quebec continue to be an exception: 48% of students score Level 4 and above, 17 points above the OECD average.

Students coming from families with high education levels also tend to do well on the PISA Mathematics tests. The top five OECD countries in this category are Korea (73%), Poland (71%), Japan (68%), Switzerland (65%) and Germany (64%), marginally ahead of the state of Massachusetts at 62%. Five other American states have high-achievement-level proficiency rates of 58% or 59%, comparable to the Czech Republic (58%) and higher than Canada (57%) and Finland (56%). Canada ranked 12th on this measure, well behind Korea, Poland, Japan, Switzerland and Germany.

Educators professing to be “progressive” in outlook tend to insist that we must cure poverty before we can raise the standards of student performance. More pragmatic educators tend to claim that Canadian schools are doing fine, except for the schools serving our disadvantaged populations, particularly Indigenous and Black children.  Taking a broad, international perspective, it appears that both assumptions are questionable. There are really two achievement gaps to be bridged – one between the affluent/advantaged and the poor/disadvantaged and the other one between Canadian high achievers and their counterparts in the top PISA performing countries.

Does low Socio-Economic Status (SES), marked by child and family poverty, set the pattern for student achievement in a deterministic fashion? To what extent can and do students break out of that mold? How critical are other factors such as better teacher quality, higher curriculum standards, and ingrained ethno-cultural attitudes? Do school systems like those of Canada and Finland tend to focus on reducing educational inequalities at the expense of challenging their high achievers? Is this the real reason that many leading western G20 countries continue to lag behind those in Asia?


A funny thing has happened to the Organization for Economic Co-operation and Development (OECD) Education Office on its way to the fifth annual International Summit on the Teaching Profession (ISTP 2015). An ambitious international movement, initiated in March 2011 in New York City and dedicated to “Improving Teacher Quality Around the World” now sounds ‘warm and fuzzy’ on those professional issues that really matter – building better teachers, improving classroom instruction, and ensuring teaching effectiveness.

When the world’s Education Ministers and over 400 invited delegates from 17 countries arrive at the Banff Springs Hotel on March 29 and 30, the word “test” and the acronym “PISA” will scarcely be heard. Instead of focusing on raising student achievement levels, the OECD Education Bureau has “gone soft” with an ISTP theme and policy paper that soft-pedals raising standards in favour of supporting teachers and building their confidence to prepare students for a rather nebulous “rapidly changing world.”

Mounting criticism of International Test Mania, dubbed “PISAfication,” the rise of a vocal American-led anti-standardized testing movement, and a partnership with Big Teacher, the Education International union federation, have all caused the mastermind of the International Teaching Summits, Andreas Schleicher, Director for Education and Skills, OECD, to change his tune. In place of weighty policy briefs stuffed with OECD student and teacher performance data, we now have a mighty thin 59-page brief spouting rather mundane banalities about supporting teachers in producing “21st-century learners.”

The ISTP 2015 agenda is clearly the work of three influential education experts, the formidable Schleicher, Ontario’s ageless education change wizard Michael Fullan, and Stanford University education professor Linda Darling-Hammond, passed over in 2008 by President Barack Obama in his choice for U.S. Secretary of Education. Two of the three in that troika have spent their careers urging governments to invest in teachers and enhance professional support programs rather than to focus on student and teacher accountability.

Since Canada, alone among the leading OECD countries, has no federal Department of Education, the titular head of our national delegation and host of ISTP 2015 will be Alberta Education Minister Gordon Dirks, currently serving as Chair of the Council of Ministers of Education, Canada (CMEC). Dianne Woloschuk, President of the Canadian Teachers’ Federation, will be at his side, modelling the collegial partnership model so common in the higher echelons of Canada’s provincial and territorial school systems.

The Alberta Teacher Summit is particularly focused on promoting the so-called “learning partnership” between education ministers and teachers’ union leaders, and that is obvious from the media releases and invitation lists. While Mike Cooper of the Toronto-based Learning Partnership is on the planning team, the only visible partnerships with business are with leading “learning corporations” like Pearson International and SMART Technologies, who tend to underwrite most of the sessions promoting their systems, products, and curricula.

The Great Powers will be represented by Arne Duncan, United States Secretary of Education, and Hao Ping, Vice Minister of Education, Peoples’ Republic of China, although much of the agenda runs counter to their current ‘higher standards’ educational reform priorities.

Judging from the laudatory treatment of Finland in the ISTP 2015 policy brief, Krista Kiuru, Minister of Education and Science, will be there to provide fresh evidence of the superiority of Finnish teachers and their extraordinary professionalism. Even though Finnish students have slipped on recent PISA tests, that system continues to be the “holy grail” for teachers opposed to regular student testing and school choice of any kind.

Anyone looking for specific policy measures to improve the quality of teaching will be disappointed with the official menu. The ISTP 2015 brief and the results of TALIS 2013, the OECD study of teacher competencies and perspectives, which included 20 teachers in each of 200 schools in Canada, focus on ways of strengthening teachers’ confidence levels and helping them to overcome “risk-aversion” to innovation.

After five consecutive years of Summitry, it is high time to get into the real nitty-gritty and build actual classroom teachers into the process. From the outside looking in, the Summit resembles a gathering of education ministers and system insiders who purport to know what’s best for teachers as well as students in today’s classrooms. In other words, a high-altitude, “risk-free” summit.

Two fundamental questions arise: Whatever happened to all the recent independent research calling for major reforms to teacher education, professional standards, and classroom accountability? And most importantly, where are the exemplary classroom teachers on that star-studded international guest list?

Read Full Post »

Today the Organisation for Economic Co-operation and Development (OECD) has succeeded in establishing the Programme for International Student Assessment (PISA) test and its national rankings as the “gold standard” in international education. Once every three years since 2000, PISA has provided a global benchmark of where students 15 years of age rank in three core competencies — reading, mathematics, and science. Since its inception, United States educators have never been enamoured of international testing, in large part because American students rarely fare very well.

So, when the infamous OECD PISA Letter was published in early May 2014 in The Guardian and later in The Washington Post, the initial signatory list contained some familiar American anti-testing crusaders, such as Heinz-Dieter Meyer (SUNY, Albany), David Berliner (Arizona State University), Mark Naison (BAT, Fordham University), Noam Chomsky (MIT) and Alfie Kohn, the irrepressible education gadfly. That letter, addressed to Andreas Schleicher, OECD, Paris, registered serious concerns about “the negative consequences of the PISA rankings” and appealed for a one-cycle (three-year) delay in the further implementation of the tests.

The global campaign to discredit PISA earned a stiff rebuke in Canada. On June 11 and June 18, 2014, the C.D. Howe Institute released two short commentaries demonstrating the significant value of PISA test results and effectively countering the appeal of the anti-PISA Letter. Written by Education Fellow John Richards, the two-part report highlighted the “Bad News” in Canada’s PISA results and then proceeded to identify What Works (specific lessons to be learned) based upon an in-depth analysis of the once-every-three-years tests. In clear, understandable language, Richards identified four key findings to guide policies formulated to “put Canadian students back on track.”

The call for a pause in the PISA tests was clearly an attempt to derail the whole international movement to establish benchmarks of student performance and some standard of accountability for student achievement levels in over 60 countries around the world. It was mainly driven by American anti-testers, but the two Canadian-based signatories were radical, anti-colonialist academics: Henry Giroux (English and Cultural Studies, McMaster University) and Arlo Kempf (Visiting Professor, Program Coordinator, School and Society, OISE).

Leading Canadian educationists like Dr. Paul Cappon (former CEO, Council on Learning) and even School Change guru Michael Fullan remain supporters of comparative international student assessments. That explains why no one of any real standing or clout from Canada was among the initial group and why, by late June, only 32 Canadian educationists could be found among the 1,988 signatories from all over the globe. Most of the home-grown signatories were well-known educators in what might be termed the “accountability-free” camp, many, like E. Wayne Ross (UBC) and Marc Spooner (U Regina), fierce opponents of “neo-liberalism” and its supposed handmaiden, student testing.

John Richards’ recent C.D. Howe commentaries should, at least temporarily, silence the vocal band of Canadian anti-testers. His first commentary made very effective use of PISA student results to bore deeply into our key strengths and areas of concern, province by province, focusing particularly on student competencies in mathematics. That comparative analysis is fair, judicious, and research-based, in sharp contrast to the honey-coated PISA studies regularly offered up by the Council of Ministers of Education, Canada.

The PISA results tell the story. While he finds Canadian students overall “doing reasonably well,” the main concern is statistically significant declines in all provinces in at least one subject, usually either mathematics or reading. Quebec leads in mathematics, but in no other subject. Two provinces (PEI and Manitoba) experienced significant declines in all three subject areas. Performance levels have sharply declined (over 30 points) in mathematics in both Manitoba and Canada’s former leader, Alberta. Such results are not a ringing endorsement of the mathematics curriculum based upon the Western and Northern Canada Protocol (WNCP).

The warning signs are, by now, well known, but the real value in Richards’ analysis of the PISA results lies in his very precise explanation of the actual lessons to be learned by educators. What really matters, based upon those results, are public access to early learning programs, posting of school-level student achievement results, paying professional-level teacher salaries, and the competition provided by achievement-oriented private and independent (not-for-profit) schools. Most significantly, his analysis confirms that smaller class sizes (below 20 pupils per class) and increased mathematics teaching time have a negligible effect on student performance results.

The C.D. Howe analysis of the PISA results hit home with The Globe and Mail, drawing a favourable editorial, but was predictably ignored by the established gatekeepers of Canada’s provincial education systems. Why the reluctance to confront such research-based, common-sense findings? “Outing” the chronic under-performance of students from certain provinces (PEI, Manitoba, New Brunswick, and Nova Scotia) is taboo, particularly inside the tight CMEC community and within self-referential Canadian Education Association (CEA) circles. For the current Chair of CMEC, Alberta Education Minister Jeff Johnson, any public talk of Alberta’s precipitous decline in mathematics is anathema.

Stung by the PISA warning shots, Canada’s provincial education gatekeepers tend to be less receptive to sound, research-based, practical policy correctives. That is a shame, because the John Richards reports demonstrate that both “sides” in the ongoing Education War are half-right, and by mixing and matching we could fashion a much more viable, sustainable, and effective policy agenda. Let’s tear up the tiresome Neo-Con vs. Anti-Testing formulas and re-frame education reform around what works: broader access to early learning, open accountability for student performance levels, respectable, professional-level teacher salaries, and useful competition from performance-driven private and independent schools.

What’s the recent American public noise over “PISAfication” all about, anyway? Why do so many North American educators still tend to dismiss the PISA test and the sound, research-based studies stemming from the international testing movement? To what extent do John Richards’ recent C.D. Howe Institute studies suggest the need for a total realignment of provincial education reform initiatives?

Read Full Post »

Measuring what matters in education is a vitally important public policy issue fraught with controversy. Since 2000, the Organisation for Economic Co-operation and Development (OECD) has succeeded, through the Programme for International Student Assessment (PISA), in establishing the global benchmark for student achievement in the fundamentals of reading, mathematics and science. Over sixty countries have come together to support student achievement testing, and most participating nations have developed comparable national and state/provincial cyclical assessment programs. That global consensus is now under fire from a revivified movement of North American educators purporting to be ‘education progressives.’

A “Measuring What Matters” movement has arisen, attempting to “broaden the measures of success” but essentially committed to either “softening” the standards or banishing standardized testing altogether. The Ontario Broader Success project, initiated by Annie Kidder and People for Education, is in the vanguard of the attempt to water down student testing by incorporating “softer” competencies and socially progressive attitudes. A growing band of North American education progressives, endorsed by education gadfly Alfie Kohn, issued a May 6, 2014 OECD PISA Letter objecting to “the negative consequences of the PISA rankings” and claiming that “measuring a great diversity of educational traditions and cultures using a single, narrow, biased yardstick could, in the end, do irreparable harm to our schools and our students.”

The real agenda of the Canadian insurgency is to broaden the definition of student success and chip away at the foundation of student testing and public accountability. In June 2013, People for Education released a Broader Measures of Success report which gave a clearer picture of the end game. Building upon its long-held skepticism about testing, Kidder and P4E announced a five-year project to “broaden the definition of school success” to encompass more than “literacy and numeracy.” The report, produced by researcher Kelly Gallagher-Mackay, proposed a new framework of six domains, only one of which related to “academic achievement.” Indeed, the P4E model attempted to subordinate academic achievement to the pursuit of five other goals, including physical and mental health, social-emotional development, creativity and innovation, and school climate.

One of the most credible proponents of the Broader Success agenda is Dr. Charles Ungerleider, a UBC professor and former BC Deputy Minister of Education. Much of the substance of the critique comes from Ungerleider, a well-compensated educational consultant committed to empowering teachers and thereby improving instruction. In a very revealing BC public affairs show, Your Education Matters with Dr. Paul Shaffer, Ungerleider laid bare the goals of the movement. “We should broaden the definition of success on a system-wide basis,” he stated. “We can assess a student’s moral framework…evaluate the level of social responsibility…and evaluate compassion for fellow human beings.”

Ungerleider claims to support student testing, but he is adamantly opposed to “the misuse of (student performance) information.” Ranking schools based upon student results qualifies as “a misuse of information” perpetrated by think tanks like the Fraser Institute and AIMS. Promoting a broader concept of school success is, he advises Shaffer, the best way to “educate the public about what’s wrong with school rankings.”

The Broader Success movement is going all out to win the support of Canadian teachers’ unions like the Alberta Teachers’ Association. On March 27, 2014, the ATA Magazine virtually endorsed their approach by publishing a short column written by Kidder, Gallagher-Mackay and Ungerleider. It appealed to teachers who are generally allergic to student testing and accountability. “By changing what is measured,” the trio wrote, “the initiative will support positive change in schools and make more room for the curriculum, programs and resources that support health, creativity, citizenship, social-emotional skills and positive school climate.” All three had delivered that same message in a May 26, 2013 presentation at the Canadian Society for the Study of Education conference at Brock University.

The Ontario Government appears to be listening to the Broader Success advocates, judging from the April 2014 policy statement Achieving Excellence: A Renewed Vision for Education in Ontario. Consistent with the Dalton McGuinty-Kathleen Wynne policy orientation, the new direction document attempts to move beyond instilling the fundamentals and embraces the pursuit of “soft” competencies and skills. Achieving excellence as measured in PISA reading and mathematics scores remains first in priority, but the Ministry of Education is now tilting in the direction of “ensuring equity” and “promoting well-being.”

Where is Ontario public education heading? The Achieving Excellence policy statement provides a few clues. It appears that Ontario, trading on its claim to be one of “the world’s highest-performing school systems,” is now flirting with the Broader Success policy panacea. Annie Kidder and People for Education no longer qualify as “outsiders” and have succeeded in burrowing into the Ontario education establishment. With Dr. Ben Levin out of commission and Dr. Michael Fullan in a 21st Century Learning orbit, the system has lost its moorings, and pinning down its future direction is purely a matter of speculation.

Focusing on student educational deficits can become the system-wide raison d’être in the absence of clear aspirational standards. That is the focus of Ungerleider and People for Education. The highly successful Education Quality and Accountability Office (EQAO) is no longer in the forefront, and that is a bad omen. Recent research by Australian John Guenther, pointing out the value of assessing the social capital of school-community partnerships and the effectiveness of alternative special education programs for at-risk children, is lost on the Ontario educational insiders. So are legitimate concerns raised about the costs of building a complete battery of system-wide “soft” measures. Where student assessment standards wither and public accountability falters, mediocrity is not far behind.

Why are North American ‘neo-progressive’ educators abandoning academic standards and looking to broaden or kill the PISA assessments? What is the real purpose of Ontario’s People for Education initiative promoting Broader Success measures for students and schools? To what extent is that initiative motivated by the desire to return to an “accountability-free” school system? Can moral standards and social responsibility be quantified, and, if so, for what purpose? Finally, will any of these changes produce students who are better educated, productive, resilient, and prepared to thrive in the 21st century?

Read Full Post »
