
Six years ago, a positively gushing August 2017 BBC News story anointed Canada an “education superpower” on the basis of its recent Programme for International Student Assessment (PISA) scores in mathematics, science and reading. Today, after the release of the PISA 2022 assessments, such a claim would be dismissed as preposterous.

On the latest round of tests for 15-year-olds, Canadian students continued to slide in mathematics, reading and science. In Mathematics, the prime focus of the 2022 global assessment, our students dropped again, from 512 in 2018 to 497 in 2022, a 35-point decline since 2003. Concerns about taking a “COVID hit,” raised in my November 29 research report, Pandemic Fallout (Cardus Foundation), were borne out in the latest scores.

Our national education agency, the Council of Ministers of Education, Canada (CMEC), demonstrated, once again, its tendency to denialism. Crowing about finishing in 9th place in Mathematics means little when we are steadily losing ground to the global leaders, the Asian powerhouses of Singapore, Macao, Chinese Taipei, Hong Kong, Japan, and Korea, and two fast-improving European states, Estonia and Switzerland.

What’s more concerning is that the decline from 2000 to the present is consistent right across the board, in reading and science as well as mathematics. Canadian student skills in reading mirror the downward trend, dropping from 524 in 2009 to 507 in 2022, a decline of 17 points. The picture is marginally better in science but still consistent with the pattern of steady decline over the past two decades.

Apologists for Canada’s declining performance are running out of rationalizations. Cherry-picking the mathematics data, the best the CMEC communications team could come up with was that some 78 per cent of Canadian students achieved Level 2, signifying that they are functionally numerate. The overall decline in mathematics, reading and science is, rather sadly, explained away as part of the “trend seen in the majority of participating countries and economies.”

The PISA 2022 Study report is a rather thick, almost impenetrable, quantitative research study that takes weeks to digest and analyze, even for veteran researchers. The Canadian national sub-report, Measuring Up: Canadian Results, OECD PISA 2022, is helpful in summarizing Canadian student performance levels with provincial/territorial breakdowns.

The pandemic fallout was expected, but all the PISA 2022 results did was accentuate and accelerate the longer-term downward slide. It’s serious when the OECD Education tsar, Andreas Schleicher, describes the Canadian student decline in mathematics as a legitimate concern.

Two Canadian provinces, Alberta and Quebec, are responsible for keeping our PISA results from becoming a “mission-critical crisis.” In Mathematics, Quebec students continue to head the class, scoring 514, some 10 points above Alberta. When it comes to Reading, Alberta leads the pack at 525, albeit down from 532 in 2018. All of the Atlantic provinces tanked on the PISA 2022 tests in Mathematics, with Newfoundland and Labrador (459), New Brunswick (468), and Nova Scotia (470) falling below the OECD average.

Canada’s “learning province,” Ontario, is in a slow downward spiral, in spite of its mammoth education budget. On PISA 2022, Ontario students sank to new lows in Mathematics, registering a 495, down 35 points over 20 years. Reading scores in Ontario were better, at 512, but some 19 points below 2009. With the Ontario Right to Read reforms underway in K to Grade 3, student reading competencies should be higher from 2028 onward, when that initial cohort turns 15 years of age.

Learning loss is real, and the pandemic generation has not rebounded. What the PISA 2022 student scores reveal is that, in Canada and worldwide, there is a “significant learning deficit,” and it continues four years after the COVID-19 outbreak and massive school disruption. Recognizing the problem is the first step in shattering the complacency and getting past the “pandemic fatigue.”

Getting our students ‘back-on-track’ will take courageous educational leadership, significant changes in school culture, implementation of the “science of learning” in classrooms, and new policies aimed at reclaiming the minds of students far too absorbed in cyberworlds. It’s all a matter of improving the effectiveness of classroom instruction and being better prepared for the next major disruption in the years ahead.

Why does the public release of PISA student achievement results attract so much global attention? How have the PISA scores in mathematics and reading become proxies for the quality of school systems? Which are the most important revelations – the actual scores, country rankings, or the longer-term trends? Do we focus too much on the math and reading scores and miss out on some potentially more significant findings buried in the technical reports?


The Programme for International Student Assessment (PISA), managed by Andreas Schleicher and the OECD Directorate for Education and Skills in Paris, France, is still regarded as the “gold standard” in comparative student assessment and enjoys a rather charmed life. Every three years, educational leaders, commentators, and engaged teachers eagerly await the results of student testing and its so-called ‘league tables’ ranking the performance of 15-year-olds from some 79 participating jurisdictions. A new book, Dire Straits: Education Reforms, Ideology, Vested Interests and Evidence, produced by two Spanish researchers, Montserrat Gomendio and José Ignacio Wert, is sure to rock that edifice and punch holes in the credibility of the OECD’s education branch.

Student assessment and accountability are essential and yet elusive in global K-12 education, both within countries and internationally, so school reformers put faith in international large-scale assessments (ILSAs) like PISA to provide solid evidence on how students are actually performing in the core skills of reading, mathematics and science. Across the globe, educational leaders and policy-makers looked to PISA to provide evidence and guidance to allow us to form a consensus on what works in different countries, and particularly on what can be learned from student achievement gains in top-performing nations. That has not happened, according to one of the book’s authors, Montserrat Gomendio, OECD’s former deputy director for education and head of its Centre for Skills. It’s all spelled out in a devastating critique in the current Winter 2023 edition of Education Next.

PISA is OECD Education’s crown jewel in an organization dedicated to providing reliable data and policy advice, encouraging comparative analysis and learning exchanges worldwide.  From the first cycle of PISA (2000) to the last (2018), the number of participating countries increased from a rather homogeneous group of 32 OECD countries to some 79, owing largely to the addition of many low- and middle-income countries. Flush with its own success, the OECD made a boastful claim: “PISA has become the world’s premier yardstick for evaluating the quality, equity and efficiency of school systems, and an influential force for education reform.”

PISA’s own data tells the tale. “After almost two decades of testing, student outcomes have not improved overall in OECD nations or most other participating countries,” according to Gomendio. She recognizes that, up until 2018, a global recession, the rise of social media, and environmental disasters did present “headwinds for school-improvement efforts.” Failing to achieve its mission, she points out, led to “blame games.” That was precipitated by the dawning realization that student outcomes had flatlined from 2000 to 2018. In response, OECD Education officials pointed fingers at its own member states for not taking advantage of the PISA data and carrying out the recommended policy changes.

Policy recommendations from PISA are built upon two different approaches – quantitative analyses of student outcomes and a range of features of education systems, and qualitative analyses of low- and top-performing countries. It is commonly agreed that PISA’s quantitative analyses of cross-sectional samples and correlations cannot be used to draw causal inferences. Its qualitative analyses, particularly with regard to Nordic countries, also suffer from serious drawbacks such as cherry-picking. Other weaknesses, she points out in Education Next, have gone largely unnoticed. One of the biggest question marks is the reliability of student results on such “low stakes” tests. In the case of Australia, for example, the Australian Council for Educational Research (ACER) found that a majority of Australian students (73 per cent) may not have taken the PISA test seriously and would have invested more effort if it counted towards their marks.

Quality and Equity – Confronting the Contradictions

PISA seeks to measure two complementary dimensions of education systems: quality and equity. Measuring quality on the basis of average student test scores is far easier than assessing equity. To do so, PISA employs a multidimensional concept using metrics such as the relationship between socioeconomic status and student performance, the degree of differences in student performance within and between schools, and many others. None of these variables, Gomendio points out, “tell the full story” and “each of them leads to different conclusions.” So, ultimately, PISA’s prism on equity is too narrow and somewhat unreliable.

PISA’s analysis of school choice and policy recommendations on that issue draw fire from Gomendio and Wert. Claims that students in private schools do not perform better than those in public schools (after correcting for socioeconomic status) are problematic. Analyses lumping private schools together with government-funded, privately managed charter schools skew the results and make it impossible to disaggregate the data. That explains why PISA analyses are at odds with other international assessments, as well as research studies, which show that “school choice often does lead to better student outcomes without necessarily generating segregation.” In addition, the small number of countries with early tracking (streaming into academic and applied/vocational) show “little (if any) differences in student performance and employability rates for vocational-education students.” It is clear that PISA would benefit from thinking outside the box, paying attention to academic research and looking at the broader picture.

The new book Dire Straits, written by the two Spanish researchers, squarely confronts PISA’s implicit bias in favour of Finland and other Nordic countries. The authors are particularly critical of PISA’s analyses of Finland and Germany. They call into question PISA’s first-cycle lionizing of Finland for its “quality and equity” and its labelling of Germany as a “heavily tracked system” that promoted inequity and “should be avoided.”

Nordic societies like Finland get a free ride with PISA because they were egalitarian long before its inception. Egalitarian societies like Finland possess natural advantages, since teachers work with a more uniform student population and are better positioned to implement inclusive policies across the board for all students. More stratified societies, in Europe and Latin America for example, require more differentiated approaches to meet the needs of the full spectrum of students. More recognition should be accorded to stratified societies with income inequalities, which tend to face bigger challenges in closing the equity gap. In the case of Canada, for example, it is useful to examine how our country manages to maintain reasonable student achievement standards while alleviating the equity gap, particularly in relation to the United States.

Identifying Exemplars, Applying the Right Lessons

PISA completely missed the boat on the rise of student outcomes in Singapore and its East Asian neighbours and the relative decline of Finland. A few decades ago, Singapore had a largely illiterate population and very few natural resources. The country made a decision to invest in human capital as the engine of economic growth and prosperity, and, in a few decades, it became the top performer in all international assessment programs. Part of that improvement can be attributed to implementing tracking in primary school in an effort to decrease its high dropout rate. Once this was achieved, the country delayed tracking until the end of primary school. So far, PISA has not provided a coherent and differentiated analysis of the “Singapore Miracle.”

Teacher quality is more salient than PISA recognizes in its analyses. In Singapore and the East Asian countries, only top-performing students can enter education-degree programs, whereas poorer-performing Latin American countries tend to draw teachers from the weaker academic ranks. Professional recruitment programs are mostly weak, and teacher evaluation mechanisms almost non-existent. Teacher unions are not always helpful in improving the quality of instruction. In Latin America, teacher unions exercise considerable clout and have succeeded in securing lower class sizes, generating more teaching positions. Top-performing East Asian countries, on the other hand, tend to have weaker unions, and there are, consequently, fewer political costs involved in running larger class sizes or in implementing rigorous teacher evaluation systems. Increases in education spending get invested in reducing class sizes, contrary to the PISA recommendation and in the face of robust evidence that smaller classes do not improve student outcomes.

Conclusions

Ideology, education governance and conflicts of interest all serve to undermine the overall effectiveness of evidence-based, PISA-informed policy prescriptions. Education authorities tend to be risk-averse when it comes to implementing targeted policy prescriptions and to resisting pressures to increase spending levels, pressures driven by vested interests, most notably local education authorities and teacher unions.

Three key lessons jump out of the latest book on PISA. First, decreases in class size and increases in teacher salaries do little to improve student achievement, but such policy recommendations run headlong into the vested interests of unions and the preferences of active parents alert to any diminution in the resources received from public funds. Secondly, some influential factors (such as school autonomy and site-based management) are “strongly context-dependent” and difficult for policymakers to interpret; in such cases, applying policies universally can yield dire consequences. Finally, attempts to measure equity, including those of PISA analysts, tend to be inconclusive and partial, leading to recommendations that are more often than not “heavily influenced by ideology.” This has led to a universal recommendation to apply comprehensive policies and avoid those regarded as ‘discriminatory’ (such as ability grouping and early tracking) – policies that produce the worst equity outcomes in more stratified societies.

Pointing fingers and apportioning blame have become all too common in the OECD’s highly influential PISA reports. What’s clear from the latest critique, levelled by the two former PISA insiders, is that flatlined student outcomes and policy shortcomings have much to do with PISA’s implicit biases (ideology), structural impediments (union advocacy), and conflicts of interest (service provider capture). That is why, according to the critics, PISA is failing in its mission.

Judging from the latest book, PISA has made little difference in improving school systems.  Is PISA failing in its mission? With so much evidence from student testing, why do education systems tend to brush aside the runaway success of top-performing Asian countries and, perhaps most importantly, why do so many systems continue to struggle?

 