
Archive for February, 2023

The worst-kept educational secret is leaking out: most Canadian K-12 students in all provinces suffered setbacks during the pandemic. The latest province to report on the decline in student test scores is Nova Scotia, a middling Canadian province widely considered a bellwether for national trends. Right on forecast, that province’s students performed dismally on the latest 2021-22 battery of results. Alarming student test scores in reading, writing and mathematics generated considerable media attention, but it remains to be seen whether they will light a fire under the gatekeepers of the provincial schoolhouse.

One in three Grade 3 students (32 per cent) cannot read with comprehension, and half of those students cannot write properly. It doesn’t get better by Grade 6 in reading or mathematics.  Two out of five in Grade 10 fail to meet acceptable standards in mathematics. This is not new at all, just worse because of school shutdowns, periodic interruptions, and absenteeism.

Signs of flagging student progress are everywhere in that province’s classrooms. Students are still guessing at words while reading in the early grades. Most elementary kids are rarely asked to write more than a sentence or two. Left on their own to master mathematics, students’ skills have eroded to an alarming degree. Getting kids to turn off their cellphones saps a lot of energy.

Confronting the hard data on the downward spiral, Education Minister Becky Druhan and her department were quick to blame the pandemic. Abysmal post-COVID student test scores were posted, the pandemic was offered up as the explanation, and, two days later, a reactive plan materialized out of thin air.

The “education crisis” escape plan was thrown together in reaction mode. Provincial education officials must have been banking on no one bothering to look any deeper, track student data trends, or question why the department is still entrusted with evaluating its own effectiveness in teaching, learning and curriculum.

Reading and writing skills have actually been in steady decline for a decade or more. Some 68 per cent of Grade 3 students in 2021-22 met minimum standards in Reading, down 8 points from 76 per cent in 2012-13. Student writing standards in Grade 3 have deteriorated significantly in all aspects of writing proficiency (Ideas, from 88% to 50%; Organization, from 80% to 38%; Language Use, from 83% to 43%; and Conventions, from 71% to 32%). Two out of three Grade 3s are familiar with Snapchat but exhibit little proficiency in grammar or spelling, and most can barely write a complete sentence.

Student proficiency by Grade 6 is critical because, as the October 2022 World Bank report on pandemic global learning loss claimed, students unable to read by 10 years of age are considered to be living in “learning poverty.” Until recently, that problem seemed far removed from the lives of Nova Scotian and Canadian children.

Six out of 10 kids in the world’s low-income and middle-income countries are now classified as “learning poor,” putting their future in jeopardy and their lives at risk. The World Bank estimates that from 4.3 to 8.3 per cent of Canadian 10-year-olds qualify as “learning poor.” It’s much higher in Nova Scotia, where 29 per cent of our 10-year-olds (in Grade 6) lack basic proficiency in reading.

Math standards tend to fly below the radar in Nova Scotia, and the Education Department is culpable. Thirty per cent of Grade 3s lack proficiency in math skills, but it’s impossible to track past trends: shifting the tests from Grade 3 to Grade 4 and back again since 2011-12 deprived us of comparable data. The decline is harder to conceal in Grade 6, where student scores have dropped from 73 per cent (2012-13) to 64 per cent a year ago. One third of Grade 6s fall below provincial math standards.

Buried in the latest batch of published results are “disaggregated” student test results for two groups of students, those of African heritage and Indigenous ancestry.  That reflects the department’s recent focus on supporting students and improving results among those in racialized and marginalized communities.

While that support has been a major priority, the pandemic disruption has wiped out previous gains. Grade 3 Reading scores for students of African heritage held firm at 57 per cent meeting standards, some 12 points below the provincial average. Writing remains a serious problem, with fewer than half of the cohort of 695 students meeting expectations. A similar-sized cohort of Mi’kmaw/Indigenous students in Grade 3 suffered similar setbacks during the pandemic. In high school, African and Indigenous students at the Grade 10 level performed far better in Reading than in Mathematics, where both cohorts have lost significant ground in comparison with their peers.

So far, Druhan and her department have fumbled the ball during the pandemic disruption. Cancelling school for 22 weeks between March 2020 and June 2021 put students and teachers in a much-weakened position. Since then, provincial authorities have been essentially asleep, waiting, it now appears, for hard evidence that students at every grade level are far behind in their progress and poorly prepared to advance to the next level.

Nowhere is the Education Department’s ‘muddle-through’ mentality better exemplified than in its slow-footed, ad hoc response to the deepening literacy crisis. After ignoring the Ontario Human Rights Commission’s Right to Read report upon its release, Druhan and her officials finally, six months later, produced a “Six Pillars” framework for discussion in June of last school year. The document endorsing ‘structured literacy’ was issued, but implementation was voluntary and earmarked for a number of “pilot schools.”

Provincial literacy experts were taken aback when the “Six Pillars” framework surfaced again in the immediate aftermath of the disastrous scores. Conventional reading and writing strategies, including “balanced or levelled literacy” and “Reading Recovery,” remain in place, even though they were rejected months ago in Ontario and other provinces. The just-announced “new plan” for Grade 2 literacy is nothing of the sort. After keeping the “Six Pillars” under wraps, the department is only now introducing it to teachers, delaying implementation for another full year.

Establishing a Nova Scotia Student Progress Assessment agency is now mission-critical in Primary to Grade 12 education. Learning erosion has worsened since January 2018 when Dr. Avis Glaze recommended creating such an agency reporting to the public, not the department. Delaying the release of student test data, resisting evidence-based policy making, and denying the pandemic’s impact may be the last straw. The department should not be entrusted with evaluating the success of its own policies, curriculum and practices. It’s high time for more public accountability and action plans informed by the best evidence gathered through student assessment.

Why are education authorities blaming the “learning erosion” on the Pandemic disruption and treating it as an aberration? How representative is Nova Scotia, where literacy and mathematics skills have been in decline for a decade or more?  What is the point of establishing ‘learning outcomes’ without implementing changes which might enable teachers to come closer to meeting those student achievement benchmarks? Is the irregular and uneven response to the Ontario Right to Read inquiry findings symptomatic of broader concerns?


The Programme for International Student Assessment (PISA), managed by Andreas Schleicher and the OECD’s Directorate for Education and Skills in Paris, France, is still regarded as the “gold standard” in comparative student assessment and enjoys a rather charmed life. Every three years, educational leaders, commentators, and engaged teachers eagerly await the results of student testing and its so-called ‘league tables’ ranking the performance of 15-year-olds from some 79 participating jurisdictions. A new book, Dire Straits: Education Reforms, Ideology, Vested Interests and Evidence, produced by two Spanish researchers, Montserrat Gomendio and José Ignacio Wert, is sure to rock that edifice and punch holes in the credibility of the OECD’s education branch.

Student assessment and accountability are essential yet elusive in global K-12 education, both within countries and internationally, so school reformers put faith in international large-scale assessments (ILSAs) like PISA to provide solid evidence on how students are actually performing in the core skills of reading, mathematics and science. Across the globe, educational leaders and policy-makers looked to PISA for the evidence and guidance needed to form a consensus on what works in different countries, and particularly on what can be learned from student achievement gains in top-performing nations. That has not happened, according to one of the book’s authors, Montserrat Gomendio, OECD’s former deputy director for education and head of its Centre for Skills. It’s all spelled out in a devastating critique in the Winter 2023 edition of Education Next.

PISA is OECD Education’s crown jewel in an organization dedicated to providing reliable data and policy advice, encouraging comparative analysis and learning exchanges worldwide.  From the first cycle of PISA (2000) to the last (2018), the number of participating countries increased from a rather homogeneous group of 32 OECD countries to some 79, owing largely to the addition of many low- and middle-income countries. Flush with its own success, the OECD made a boastful claim: “PISA has become the world’s premier yardstick for evaluating the quality, equity and efficiency of school systems, and an influential force for education reform.”

PISA’s own data tells the tale. “After almost two decades of testing, student outcomes have not improved overall in OECD nations or most other participating countries,” according to Gomendio. She recognizes that, up until 2018, a global recession, the rise of social media, and environmental disasters did present “headwinds for school-improvement efforts.” Failing to achieve its mission, she points out, led to “blame games,” precipitated by the dawning realization that student outcomes had flatlined from 2000 to 2018. In response, OECD education officials pointed fingers at member states for not taking advantage of the PISA data and carrying out the recommended policy changes.

Policy recommendations from PISA are built upon two different approaches: quantitative analyses correlating student outcomes with features of education systems, and qualitative analyses of low- and top-performing countries. It is commonly agreed that PISA’s quantitative analyses of cross-sectional samples and correlations cannot be used to draw causal inferences. Its qualitative analyses, particularly with regard to Nordic countries, also suffer from serious drawbacks such as cherry-picking. Other weaknesses, she points out in Education Next, have gone largely unnoticed. One of the biggest question marks is the reliability of student results on such “low stakes” tests. In the case of Australia, for example, the Australian Council for Educational Research (ACER) found that a majority of Australian students (73%) may not have taken the PISA test seriously and would have invested more effort if it counted toward their marks.

Quality and Equity – Confronting the Contradictions

PISA seeks to measure two complementary dimensions of education systems: quality and equity. Measuring quality on the basis of average student test scores is far easier than assessing equity. To do so, PISA employs a multidimensional concept using metrics such as the relationship between socioeconomic status and student performance, the degree of differences in student performance within and between schools, and many others. None of these variables, Gomendio points out, “tell the full story” and “each of them leads to different conclusions.” So PISA’s prism on equity is ultimately too narrow and somewhat unreliable.

PISA’s analysis of school choice and its policy recommendations on that issue draw fire from Gomendio and Wert. Claims that students in private schools do not perform better than those in public schools (after correcting for socioeconomic status) are problematic. Analyses lumping private schools together with government-funded, privately managed charter schools skew the results and make it impossible to disaggregate the data. That explains why PISA analyses are at odds with other international assessments, as well as research studies, which show that “school choice often does lead to better student outcomes without necessarily generating segregation.” In addition, the small number of countries with early tracking (streaming into academic and applied/vocational) show “little (if any) differences in student performance and employability rates for vocational-education students.” It is clear that PISA would benefit from thinking outside the box, paying attention to academic research and looking at the broader picture.

The new book Dire Straits, written by the two Spanish researchers, squarely confronts PISA’s implicit bias in favour of Finland and other Nordic countries. The authors are particularly critical of PISA’s analyses of Finland and Germany. They call into question the first PISA cycle’s lionizing of Finland for its “quality and equity” and its labelling of Germany as a “heavily tracked system” that promoted inequity and “should be avoided.”

Nordic societies like Finland get a free ride with PISA because they were egalitarian long before its inception. Egalitarian societies possess natural advantages: teachers work with a more uniform student population and are better positioned to implement inclusive policies across the board. More stratified societies, in Europe and Latin America for example, require more differentiated approaches to meet the needs of the full spectrum of students. More recognition should be accorded to stratified societies with income inequalities, which tend to face bigger challenges in closing the equity gap. In the case of Canada, for example, it is useful to examine how our country manages to maintain reasonable student achievement standards while alleviating the equity gap, particularly in relation to the United States.

Identifying Exemplars, Applying the Right Lessons

PISA completely missed the boat on the rise of student outcomes in Singapore and its East Asian neighbours and the relative decline of Finland. A few decades ago, Singapore had a largely illiterate population and very few natural resources. The country made a decision to invest in human capital as the engine of economic growth and prosperity, and, within a few decades, it became the top performer in all international assessment programs. Part of that improvement can be attributed to implementing tracking in primary school in an effort to decrease its high dropout rate; once the dropout rate fell, the country delayed tracking until the end of primary school. So far, PISA has not provided a coherent and differentiated analysis of the “Singapore Miracle.”

Teacher quality is more salient than PISA recognizes in its analyses. In Singapore and other East Asian countries, only top-performing students can enter education-degree programs, whereas poorer-performing Latin American countries tend to draw teachers from the weaker academic ranks, with mostly weak professional recruitment programs and almost non-existent teacher evaluation mechanisms. Teacher unions are not always helpful in improving the quality of instruction. In Latin America, teacher unions exercise considerable clout and have succeeded in securing lower class sizes, generating more teaching positions. Top-performing East Asian countries, on the other hand, tend to have weaker unions, so there are fewer political costs in running larger class sizes or implementing rigorous teacher evaluation systems. Increases in education spending get invested in reducing class sizes, contrary to PISA’s recommendation and in the face of robust evidence that smaller classes do not improve student outcomes.

Conclusions

Ideology, education governance and conflicts of interest all serve to undermine the overall effectiveness of evidence-based, PISA-informed policy prescriptions. Education authorities tend to be risk-averse when it comes to implementing targeted policy prescriptions and resisting pressures to increase spending levels, pressures driven by vested interests, most notably local education authorities and teacher unions.

Three key lessons jump out of the book. First, decreases in class size and increases in teacher salaries do not improve student achievement, but such policy recommendations run headlong into the vested interests of unions and the preferences of active parents alert to any diminution in the resources received from public funds. Second, some influential factors (such as school autonomy and site-based management) are “strongly context-dependent” and difficult for policymakers to interpret; in such cases, applying policies universally can yield dire consequences. Finally, attempts to measure equity, including those of PISA analysts, tend to be inconclusive and partial, leading to recommendations more often than not “heavily influenced by ideology.” This has produced a universal recommendation to apply comprehensive policies and avoid those regarded as ‘discriminatory’ (such as ability grouping and early tracking), even though such comprehensive policies lead to the worst equity outcomes in more stratified societies.

Pointing fingers and apportioning blame has become all-too-common in OECD’s highly influential PISA reports.  What’s clear from the latest critique, levelled by the two former PISA insiders, is that flatlined student outcomes and policy shortcomings have much to do with PISA’s implicit biases (ideology), structural impediments (union advocacy), and conflicts of interest (service provider capture). That is why, according to the critics, PISA is failing in its mission.

Judging from the latest book, PISA has made little difference in improving school systems.  Is PISA failing in its mission? With so much evidence from student testing, why do education systems tend to brush aside the runaway success of top-performing Asian countries and, perhaps most importantly, why do so many systems continue to struggle?
