
Archive for the ‘PISA Test Rankings’ Category

 

The Programme for International Student Assessment (PISA), managed by Andreas Schleicher and the OECD Directorate for Education and Skills in Paris, France, is still regarded as the “gold standard” in comparative student assessment and enjoys a rather charmed life. Every three years, educational leaders, commentators, and engaged teachers eagerly await the results of student testing and its so-called ‘league tables’ ranking the performance of 15-year-olds from some 79 participating jurisdictions. A new book, Dire Straits: Education Reforms, Ideology, Vested Interests and Evidence, produced by two Spanish researchers, Montserrat Gomendio and José Ignacio Wert, is sure to rock that edifice and punch holes in the credibility of the OECD’s education branch.

Student assessment and accountability are essential and yet elusive in global K-12 education, both within countries and internationally, so school reformers put their faith in international large-scale assessments (ILSAs) like PISA to provide solid evidence on how students were actually performing in the core skills of reading, mathematics and science. Across the globe, educational leaders and policy-makers looked to PISA to provide evidence and guidance to allow us to form a consensus on what works in different countries, and particularly on what can be learned from student achievement gains in top-performing nations. That has not happened, according to one of the book’s authors, Montserrat Gomendio, OECD’s former deputy director for education and head of its Centre for Skills. It’s all spelled out in a devastating critique in the Winter 2023 edition of Education Next.

PISA is OECD Education’s crown jewel in an organization dedicated to providing reliable data and policy advice, encouraging comparative analysis and learning exchanges worldwide.  From the first cycle of PISA (2000) to the last (2018), the number of participating countries increased from a rather homogeneous group of 32 OECD countries to some 79, owing largely to the addition of many low- and middle-income countries. Flush with its own success, the OECD made a boastful claim: “PISA has become the world’s premier yardstick for evaluating the quality, equity and efficiency of school systems, and an influential force for education reform.”

PISA’s own data tells the tale. “After almost two decades of testing, student outcomes have not improved overall in OECD nations or most other participating countries,” according to Gomendio. She recognizes that, up until 2018, a global recession, the rise of social media, and environmental disasters did present “headwinds for school-improvement efforts.” Failing to achieve its mission, she points out, led to “blame games.” That was precipitated by the dawning realization that student outcomes had flatlined from 2000 to 2018. In response, OECD Education officials pointed fingers at its own member states for not taking advantage of the PISA data and carrying out the recommended policy changes.

Policy recommendations from PISA are built upon two different approaches: quantitative analyses linking student outcomes to a range of features of education systems, and qualitative analyses of low- and top-performing countries. It is commonly agreed that PISA’s quantitative analyses of cross-sectional samples and correlations cannot be used to draw causal inferences. Its qualitative analyses, particularly with regard to Nordic countries, also suffer from serious drawbacks such as cherry-picking. Other weaknesses, she points out in Education Next, have gone largely unnoticed. One of the biggest question marks is the reliability of student results on such “low stakes” tests. In the case of Australia, for example, the Australian Council for Educational Research (ACER) found that a majority of Australian students (73%) may not have taken the PISA test seriously and would have invested more effort if it had counted toward their marks.

Quality and Equity – Confronting the Contradictions

PISA seeks to measure two complementary dimensions of education systems: quality and equity. Measuring quality on the basis of average student test scores is far easier than assessing equity. To do so, PISA employs a multidimensional concept using metrics such as the relationship between socioeconomic status and student performance, the degree of differences in student performance within and between schools, and many others. None of these variables, Gomendio points out, “tell the full story” and “each of them leads to different conclusions.” So PISA’s prism on equity is ultimately too narrow and somewhat unreliable.

PISA’s analysis of school choice and policy recommendations on that issue draw fire from Gomendio and Wert. Claims that students in private schools do not perform better than those in public schools (after correcting for socioeconomic status) are problematic. Analyses lumping private schools together with government-funded, privately managed charter schools skew the results and make it impossible to disaggregate the data. That explains why PISA analyses are at odds with other international assessments, as well as research studies, which show that “school choice often does lead to better student outcomes without necessarily generating segregation.” In addition, the small number of countries with early tracking (streaming into academic and applied/vocational programs) show “little (if any) differences in student performance and employability rates for vocational-education students.” It is clear that PISA would benefit from thinking outside the box, paying attention to academic research and looking at the broader picture.

The new book Dire Straits, written by the two Spanish researchers, squarely confronts PISA’s implicit bias in favour of Finland and other Nordic countries. The authors are particularly critical of PISA’s analyses of Finland and Germany. They call into question the first PISA cycle’s lionizing of Finland for its “quality and equity” and its labelling of Germany as a “heavily tracked system” that promoted inequity and “should be avoided.”

Nordic societies like Finland get a free ride with PISA because they were egalitarian long before the program’s inception. Egalitarian societies like Finland possess natural advantages since teachers work with a more uniform student population and are better positioned to implement inclusive policies across the board. More stratified societies in Europe and Latin America, for example, require more differentiated approaches to meet the needs of the full spectrum of students, and they deserve more recognition for facing bigger challenges in closing the equity gap. In the case of Canada, it is useful to examine how our country manages to maintain reasonable student achievement standards while alleviating the equity gap, particularly in relation to the United States.

Identifying Exemplars, Applying the Right Lessons

PISA completely missed the boat on the rise of student outcomes in Singapore and its East Asian neighbours and the relative decline of Finland. A few decades ago, Singapore had a largely illiterate population and very few natural resources. The country made a decision to invest in human capital as the engine of economic growth and prosperity, and, within a few decades, it became the top performer in all international assessment programs. Part of that improvement can be attributed to implementing tracking in primary school in an effort to decrease its high dropout rate. Once this was achieved, the country delayed tracking until the end of primary school. So far, PISA has not provided a coherent and differentiated analysis of the “Singapore Miracle.”

Teacher quality is more salient than PISA recognizes in its analyses. In Singapore and the East Asian countries, only top-performing students can enter education-degree programs, whereas poorer-performing Latin American countries tend to draw teachers from the weaker academic ranks. Professional recruitment programs are mostly weak and teacher evaluation mechanisms almost non-existent. Teacher unions are not always helpful in improving the quality of instruction. In Latin America, teacher unions exercise considerable clout and have succeeded in securing lower class sizes, generating more teaching positions. Top-performing East Asian countries, on the other hand, tend to have weaker unions and there are, consequently, fewer political costs involved in running larger class sizes or in implementing rigorous teacher evaluation systems. Increases in education spending get invested in reducing class sizes, contrary to the PISA recommendation and in the face of robust evidence that smaller classes do not improve student outcomes.

Conclusions

Ideology, education governance and conflicts of interest all serve to undermine the overall effectiveness of evidence-based, PISA-informed policy prescriptions. Education authorities tend to be risk-averse when it comes to implementing targeted policy prescriptions and resisting pressures to increase spending levels, pressures driven by vested interests, most notably local education authorities and teacher unions.

Three key lessons jump out of the latest book on PISA. First, decreases in class size and increases in teacher salaries do not improve student achievement, but such policy recommendations run headlong into the vested interests of unions and the preferences of active parents alert to any diminution in the resources received from public funds. Second, some influential factors (such as school autonomy and site-based management) are “strongly context-dependent” and difficult for policymakers to interpret; in such cases, applying policies universally can yield dire consequences. Finally, attempts to measure equity, including those of PISA analysts, tend to be inconclusive and partial, leading to recommendations more often than not “heavily influenced by ideology.” This has led to a universal recommendation to apply comprehensive policies and avoid those regarded as ‘discriminatory’ (such as ability grouping and early tracking). Such policies lead to the worst outcomes in terms of equity in more stratified societies.

Pointing fingers and apportioning blame has become all-too-common in OECD’s highly influential PISA reports.  What’s clear from the latest critique, levelled by the two former PISA insiders, is that flatlined student outcomes and policy shortcomings have much to do with PISA’s implicit biases (ideology), structural impediments (union advocacy), and conflicts of interest (service provider capture). That is why, according to the critics, PISA is failing in its mission.

Judging from the latest book, PISA has made little difference in improving school systems.  Is PISA failing in its mission? With so much evidence from student testing, why do education systems tend to brush aside the runaway success of top-performing Asian countries and, perhaps most importantly, why do so many systems continue to struggle?

 


Ontario’s Mathematics program for Kindergarten to Grade 12 has just undergone a significant revision in the wake of the continuing decline in student performance in recent years. On June 24, 2020, Education Minister Stephen Lecce unveiled the new mathematics curriculum for elementary school students with a promised emphasis on the development of basic concepts and fundamental skills. In a seemingly contradictory move, the Minister also announced that the government was cancelling next year’s EQAO testing in Grades 3 and 6 to give students and teachers a chance to get used to the new curriculum.

The Doug Ford Government was elected in June 2018 on a “Back to the Basics” education pledge, but the new mathematics curriculum falls considerably short of that commitment. While the phrase “back to the basics” adorned the media release, the actual public message to parents and the public put more emphasis on providing children with practical skills. Financial literacy will be taught at every grade level and all students will learn coding or computer programming skills, starting in Grade 1 in Ontario schools. A more detailed analysis of the actual math curriculum changes reveals a few modest steps toward reaffirming fundamental computation skills, but all cast within a framework emphasizing the teaching of “social-emotional learning skills.”

The prevailing “Discovery Math” philosophy enshrined in the 2005 Ontario curriculum may no longer be officially sanctioned, but it remains entrenched in current teaching practice. Simply issuing provincial curriculum mandates will not change that unless teachers themselves take ownership of the curriculum changes. Cutting the number of learning outcomes for Grades 1 to 8 down to 465 “expectations” of learning, some 150 fewer than back in 2005, will be welcomed, especially if it leads to greater mastery of fewer outcomes in the early grades.

The parents’ guide to the new math curriculum, released with the policy document, undercuts the “back to basics” commitment and tilts in a different direction. The most significant revamp is not the reintroduction of times tables, teaching fractions earlier on, or emphasizing the mastery of standard algorithms. It is the introduction of a completely new “strand” with the descriptor “social-emotional learning skills.” That new piece is supposedly designed to help students “develop confidence, cope with challenges, and think critically.” It also embodies the ‘discovery learning‘ approach of encouraging students to “use strategies” and “be resourceful” in “working through challenging problems.”

Ontario’s most influential mathematics curriculum consultants, bracing for the worst, were quick to seize upon the unexpected gift.  Assistant professor of math education at the Ontario Institute for Studies in Education (OISE), Mary Reid, widely known for supporting the 2005 curriculum philosophy, identified the “social-emotional learning” component as “critically important” because it would “help kids tremendously.” That reaction was to be expected because Reid’s research focuses on “math anxiety” and building student confidence through social-emotional learning skills development.

Long-time advocates for higher math standards, such as math teacher Barry Garelick and Ottawa parent Clive Packer, saw the recommended approach echoing the prevailing ‘discovery math’ ideology. Expecting to see a clear statement endorsing mastery of the fundamentals and building confidence through enhanced competencies, they encountered documents guiding teachers, once again, toward “making math engaging, fun and interesting for kids.” The whole notion that today’s math teachers using traditional methods stress “rote memorization” and teach kids to “follow procedure without understanding why” is completely bogus. Such caricatures essentially foreclose serious discussion about what works in the math classroom.

How does the new Ontario math curriculum compare with the former 2005 curriculum?  Identifying a few key components allows us to spot the similarities and differences:

Structure and Content:

  • New curriculum: “clear connections show how math skills build from year to year,” consistent for English-language and French-language learners.
  • Former 2005 curriculum: Difficult to make connections from year-to-year, and inconsistencies in expectations for English-speaking and French-speaking learners.

Multiplication and division:

  • Grade 3, new curriculum: “recall and demonstrate multiplication facts of 2, 5, and 10, and related division facts.” In graduated steps, students learn multiplication facts, starting with 0 × 0 to 12 × 12, to “enhance problem solving and mental math.”
  • Grade 3, 2005 curriculum: “multiply to 7 x 7 and divide to 49 ÷ 7, using a variety of mental strategies (e.g., doubles, doubles plus another set, skip counting).” No explicit requirement to teach multiplication tables.

Fractions:

  • Grade 1, new curriculum: “introduced to the idea of fractions, through the context of sharing things equally.”
  • Grade 1, 2005 curriculum: Vague reference – “introducing the concept of equality using only concrete materials.”

Measurement of angles:

  • Grade 6, new curriculum: “use a protractor to measure and construct angles up to 360°, and state the relationship between angles that are measured clockwise and those that are measured counterclockwise.”
  • Grade 6, 2005 curriculum: “measure and construct angles up to 180° using a protractor, and classify them as acute, right, obtuse, or straight angles.”

Graphing data:

  • Grade 8, new curriculum: “select from among a variety of graphs, including scatter plots, the type of graph best suited to represent various sets of data; display the data in the graphs with proper sources, titles, and labels, and appropriate scales; and justify their choice of graphs.”
  • Grade 8, 2005 curriculum: “select an appropriate type of graph to represent a set of data, graph the data using technology, and justify the choice of graph”

Improvements in the 2020 Math curriculum are incremental at best and likely insufficient to make a significant difference. Providing students with effective instruction in mathematics is, after all, what ultimately leads to confidence, motivation, engagement, and critical thinking. Starting with confidence-building exercises gets it all backwards. Elementary mathematics teachers will be guided, first, toward developing social and emotional learning (SEL) skills: (1) identify and manage emotions; (2) recognize sources of stress and cope with challenges; (3) maintain positive motivation and perseverance; (4) build relationships and communicate effectively; (5) develop self-awareness and a sense of identity; (6) think critically and creatively. Upon closer scrutiny, these are generic skills which are not only problematic but also entirely unmeasurable.

The fundamental question raised by the new Ontario math curriculum reform is whether it is equal to the task of improving stagnating student test scores. Student results in English-language schools in Grade 3 and Grade 6 mathematics, on EQAO tests, slid consistently from 2012 to 2018. Back in 2012, 68% of Grade 3 students met provincial standards; in 2018, that figure dropped to 58%. In Grade 6 mathematics, it was worse, plummeting from 58% to 48% meeting provincial standards. On international tests, Ontario’s Programme for International Student Assessment (PISA) mathematics scores peaked in 2003 at 530, dropped to 509 by 2012, then recovered slightly in 2018 to 514, consistent with the provincial slide (See Graph – Greg Ashman). Tinkering with math outcomes and clinging to ineffective “mathematical processes” will likely not be enough to change that trajectory.

Building self-esteem and investing resources in more social and emotional learning (SEL) is not enough to turn around student math achievement. Yet, judging from the new mathematics curriculum, Ontario’s curriculum designers seem to have lost their way. It all looks strangely disconnected from the supposed goal of the reform: to raise provincial math standards and improve student performance on provincial, national, and international assessments.

What’s the real purpose of the new Ontario mathematics curriculum reform? Does the latest curriculum revision reflect the 2018 commitment to move forward with fundamentals or is it a thinly-disguised attempt to integrate social and emotional learning into the program? Where is the evidence, in the proposed curriculum, that Ontario education authorities are laser-focused on improving math standards? Will this latest reform make much of a difference for students looking for a bigger challenge or struggling in math?


“All that glitters is not gold” is a famous proverb plucked from William Shakespeare‘s play The Merchant of Venice that may well apply to recent international appraisals of K-12 education in Canada. Such rosy assessments tend to put a shiny lustre on what is essentially a sound and ‘pretty good’ school system that has lost ground to competing nations over the past decade.

Five years ago, the Organization for Economic Cooperation and Development (OECD) produced a rather rosy Education Policy Outlook for Canada as part of a series of reports offering comparative analysis of education policies and reforms across the world’s developed countries. Canada’s overall performance, aggregated from widely varied provincial assessment data, looked good in comparison with the United States, the United Kingdom, and Australia. Most significantly, the OECD assessors brushed aside concerns about “plateaued student achievement” on the Programme for International Student Assessment (PISA) tests and the decline in the proportion of top-performing students.

Emerging concerns were most clearly expressed in Dr. Paul Cappon’s final 2010 report for the Canadian Council on Learning. Student scores on the 2009 PISA test had revealed that Canadian 15-year-olds demonstrated relatively strong sets of skills in reading, math and science, but they were already slipping relative to high performing Asian countries and in some cases in absolute terms. “What I’m hoping,” Cappon said at the outset of his final cross-Canada tour, “is that when people realize that Canada is slipping down the international learning curve we’re not going to be able to compete in the future unless we get our act together.”

OECD Education Policy Outlook assessments and Country reports are based upon templates that tend to favour diverse and well-funded school systems like that of Canada. The six identified policy levers in 2015 were: 1) equity and quality of education; 2) preparing students for the future; 3) school improvement; 4) evaluation and assessment; 5) governance; and 6) funding.  Such public policy forecasts, based upon conventional criteria and historic trends, also tend to demonstrate “path dependency” which limits the capacity to capture radical shifts in context or dynamic changes in educational direction.

Fifteen-year-old students in Canada, based upon triennial PISA tests from 2000 to 2018, continue to perform above the OECD average in reading, mathematics and science. Our most economically and socially disadvantaged students, in aggregate, do relatively better than those in competing countries, demonstrating more equity than in most other countries.  A considerably higher proportion of Canadian K-12 students proceed to post-secondary education in universities and colleges. That much has not changed across time.

Three significant changes can be identified from the accumulating OECD student assessment and survey data and they deserve far more critical scrutiny:

Downward Trend in Student Performance: The performance trends for Canadian fifteen-year-olds are consistently downward from 2000 to 2018 in READING, from 2003 to 2018 in MATHEMATICS, and from 2006 to 2018 in SCIENCE. While the OECD average scores are also in decline as more countries are included in PISA, the descent is more pronounced among students from Canada. Students in Canada’s top-performing provinces of Alberta, Ontario, British Columbia and Quebec (in mathematics) tend to buoy up the lagging results produced by students from New Brunswick, Newfoundland and Labrador, Saskatchewan, and Manitoba.

Deteriorating Classroom Disciplinary Climate:

The 2015 Education Policy Outlook for Canada flagged one measure, based upon student survey responses, where Canada simply met the OECD standard – the index of classrooms conducive to learning (Figure 5, OECD Canada, 2015). That largely undiagnosed problem has worsened over the past three years. Canada ranked 60th out of 77 participating nations and educational districts in the OECD’s 2018 index of disciplinary climate, released on December 4, 2019. According to a global student survey conducted in the spring of 2018, one in five 15-year-old students report that learning time is lost to noise, distractions, and disorder, so much so that it detracts from learning in class. A relatively high proportion of Canadian students say the teacher is not listened to and it takes a long time for the class to settle down. In addition, students regularly skip school and arrive late to class.

High Incidence of Fear of Failure:

Personal anxieties may also run higher among Canadian students when they confront standardized tests and experience a fear of failing them. In Canada, the OECD 2019 Education GPS report states, “15-year-old students have a strong fear of failure,” ranking 6th among 77 national student groups participating in the survey. Fear of failure runs highest among students in Chinese Taipei, Singapore, Macau, Japan, and Germany, but is less pronounced in high-performing countries such as Korea, Estonia, and Finland. Such fears are present to the same degree among students in the United Kingdom, but less so in the United States. No analysis whatsoever is offered to explain why fears run so comparatively high among teens in Canada.

The initial report on the Canadian Results of the OECD PISA 2018 Study, released by the Council of Ministers of Education, Canada (CMEC) in early December 2019, is of little help in evaluating these rather striking trends. Like previous reports in the CMEC series, it puts a positive spin on the aggregate results by casting them within a broad, global context, lumping together countries with radically different commitments to education in terms of spending and resources. It is possible to ferret out anomalies and to conduct province-by-province comparisons, but only with time, effort, and attention to detail. That is sufficient to keep the findings either buried or accessible only to education assessment specialists.

Does the Canadian Education Policy Outlook ventured in 2015 stand up under close analysis, five years on? What’s missing from the OECD and CMEC assessment reports for Canada over the past decade? Should the Canadian public be concerned about the downward trend in the demonstration of core skills in reading, mathematics and science? Is disciplinary climate now a real concern in Canadian classrooms? And why are Canadian students so afraid of failing in our schools when grade promotion and graduation rates are at record levels?


Students and parents in the Pontus school in Lappeenranta, one of the first Finnish schools to implement the “phenomenon-based” digital curriculum, are now disputing the broad claim made by the World Economic Forum in its 2018 Worldwide Educating for the Future Index. Concerned about the new direction, parents of the children have lodged a number of complaints over the “failure” of the new school and cited student concerns that they didn’t “learn anything” under the new curriculum and pedagogy. For some, the only recourse was to move their children to schools continuing to offer more explicit teaching of content knowledge and skills.

The Finnish parent resistance is more than a small blip on the global education landscape. It strikes at the heart of the Finnish Ministry of Education’s 2016 plan to introduce “phenomenon-based” problem-solving — replacing more traditional subject-based curriculum in mathematics, science, and history with an interdisciplinary model focusing on developing holistic skills for the future workplace. Perhaps more significantly, it blows a hole in the carefully crafted image of Finland as the world leader in “building tomorrow’s global citizens.”

The basis for Finland’s claim to be a global future education leader now rests almost entirely upon that 2018 global ranking produced by the World Economic Forum, based upon advice gleaned from an ‘expert panel’ engaged by The Economist Intelligence Unit Limited. While Finland has slipped from 2000 to 2015 on the more widely recognized Programme for International Student Assessment (PISA) rankings, that educational jurisdiction remains a favourite of global learning corporations and high-technology business interests. A close-up look at who provides the “educational intelligence” to the World Economic Forum demonstrates the fusion of interests that sustains the global reputation of Finland and other Western nations heavily invested in digital technology and learning.

The 2018 World Economic Forum future education index was a rather polished attempt to overturn the prevailing research consensus. The PISA worldwide rankings – based upon average student scores in math, reading and science – place Asian countries, Estonia and Canada all ahead of Finland in student achievement. The top five performers are Singapore (551.7), Hong Kong (532.7), Japan (528.7), Macau (527.3), and Estonia (524.3). A panel of seventeen experts, selected by The Economist Intelligence Unit, set out to dispute the concrete student results of an OECD study of 70 countries ranking 15-year-olds on their scholastic performance.

The Economist Intelligence Unit index runs completely counter to the PISA rankings and attempts to rebut the well-founded claim that student mastery of content knowledge and fundamental skills is the best predictor of future student success in university, college and the workplace. Upon close examination, the World Economic Forum index seeks to supplant the established competencies and to substitute a mostly subjective assessment of “the effectiveness of education systems in preparing students for the demands of work and life in a rapidly changing landscape” (p. 1). It focuses on the 15-to-24-year-old age band in some 50 countries around the world. Setting aside how students are actually performing, we are provided with a ranking based almost exclusively on compliance with so-called “21st century learning” competencies – leadership, creativity, entrepreneurship, communication, global awareness, and civic education skills.

The poster child nation for the World Economic Forum rankings is Finland, now ranked 8th on its PISA scores, because it has now embraced, full-on, the “21st century learning” ideology and invests heavily in technology-driven digital education. The balance of the Top 5 World Economic Forum nations, Switzerland, New Zealand, Sweden, and Canada, rank 15th, 16th, 26th, and 5th on the basis of their students’ PISA scores. Most problematic of all, the future education ranking downgrades the current global education leaders, Singapore (7th), Japan (12th), and Hong Kong (15th).  Mastery of academic competencies is, based upon the assessment criteria, not relevant when you are ranking countries on the basis of their support for technology-driven, digital education.

Who produced the World Economic Forum rankings? The actual report was written by Economist Intelligence Unit contract writer Denis McCauley, a veteran London-based global technology consultant, known for co-authoring a Ricoh-sponsored white paper, Agent of Change, alerting business leaders to the urgent necessity of embracing Artificial Intelligence and technological change. The seventeen-member expert panel is dominated by the usual suspects: global technology researchers and digital education proponents. One of the more notable advisors was Google’s Chief Education Evangelist, Jaime Casap, the American technology promoter who spearheaded Google’s Apps for Education growth strategy aimed at teachers and powered by online communities known as Google Educator Groups and “leadership symposiums” sponsored by the global tech giant.

Most of The Economist Intelligence Unit advisors see Finland as the ‘lighthouse nation’ for the coming technological change in K-12 education. Heavily influenced by former Finnish education ambassador, Pasi Sahlberg, they are enamoured with the Finnish model of phenomenon-based learning and its promise to implant “21st century skills” through structural changes in curriculum organization and delivery in schools.  It’s not surprising that it was actually Sahlberg who first tweeted about the Pontus school uprising, likely to alert Finnish education officials to the potential for broader resistance.

Launched in 2016 with a flurry of favourable ed-tech friendly research, the Finnish curriculum reform tapped into the rather obscure academic field of phenomenology.  The new curriculum adopted a phenomenon-based approach embracing curriculum integration with a theoretical grounding in constructivism. All of this was purportedly designed to develop student skills for the changing 21st century workplace. The ultimate goal was also spelled out by Canadian education professor Louis Volante and his associates in a World Economic Forum-sponsored April 2019 commentary extolling “broader measures” of assessing success in education. Peeling away the sugary coating, “phenomenon learning” was just another formulation of student-centred, project-based, 21st century skills education.

The daily reality for students like Grade 6 student Aino Pilronen of Pontus School was quite different. "The beginning of the day was chaotic," she reported, as students milled around developing study plans or hung out in the so-called "market square." "It was hard for me that the teacher did not teach at first, but instead we should have been able to learn things by ourselves." Her brutally honest assessment: "I didn't learn anything."

The Economist Intelligence Unit not only ignored such concerns voiced by students and parents, but brushed aside evidence that the approach would not work for the full range of students. A Helsinki University researcher, Aino Saarinen, attributed the decline in Finland's PISA standing to the increasing use of digital learning materials. The 50 million Euros invested since 2016 in training teachers to use digital devices and laptops was not paying off, she claimed, because "the more that digital tools were used in lessons, the worse learning outcomes were" in math, science, and reading. The most adversely affected were struggling and learning-challenged students, the very ones supposedly better served under the new curriculum.

What can we learn from taking a more critical, independent look at the actual state of Finnish education?  If Finnish education is in decline and 21st century learning reform encountering parental dissent, how can it be the top ranked “future education” system?  Who is providing the educational intelligence to the World Economic Forum?  Is it wise to accept a global ranking that discounts or dismisses quantitative evidence on trends in comparative student academic achievement? 

Read Full Post »

The Homework Debate never seems to go away.  Popular books and articles, inspired initially by American education writer Alfie Kohn and his Canadian disciples, continue to beat the drum for easing the homework burden on students or eliminating homework altogether before the secondary school level. That "No Homework" movement made significant inroads in the United States and Canada during the 2000s. The Organization for Economic Cooperation and Development (OECD), responsible for the Program for International Student Assessment (PISA), confirmed that the amount of time students in North America spend doing homework had declined, as of the 2014 assessment year.


 

A critical question needs to be asked: Has the “No Homework” movement and the apparent push-back against homework had an adverse effect on student achievement? That’s difficult to answer because, despite the critical importance of the issue and the long history of homework research, few North American researchers have been inclined to study the role that homework plays in enhancing student achievement, even in mathematics.

One little-known researcher, Lake B. Yeworiew, an Ethiopian scholar based at the University of Calgary and recently arrived in Canada, saw the hole in the research and tackled the whole question. His focus was on assessing the relationship between homework and Grade 8 Mathematics achievement, comparing Canadian students with the top-performing students in the world. While attending the AERA 2019 Congress (April 5-9) in Toronto, I got a sneak peek at his findings.  While his research study attracted little attention, it will be of considerable interest to all of those committed to maintaining and improving student performance standards.


His University of Calgary study, co-authored with Man-Wai Chu and Yue Xu, laid out the essential facts: the average Mathematics performance of Canadian students on PISA has declined since 2006 (OECD, 2007, 2010, 2014, 2016). Students from three top-performing Asian jurisdictions, Singapore, Macau-China and Japan, continue to outperform our 15-year-old students by a significant margin. Furthermore, the OECD reports that students in Asian jurisdictions (Singapore, Japan, Macao-China and Hong Kong-China) spend more time doing homework and score much higher. It is estimated that they score 17 points or more per extra hour of homework.
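The "17 points or more per extra hour" estimate implies a simple linear relationship between weekly homework time and math scores. Here is a minimal sketch of that arithmetic; the baseline score and hours below are illustrative assumptions, not figures from the OECD or the Calgary study.

```python
# Hypothetical illustration of the reported homework-score relationship
# (~17 PISA points per extra weekly hour). Baseline and hours are assumed.

def projected_score(baseline: float, extra_hours: float,
                    points_per_hour: float = 17.0) -> float:
    """Linear projection of a math score given extra homework hours."""
    return baseline + points_per_hour * extra_hours

# A student near the OECD math average (~490) doing two extra hours a week:
print(projected_score(490, 2))  # 524.0
```

The linearity, of course, is the fragile part of the claim: the study itself finds diminishing returns once assignments grow longer.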

Recent North American research seems more alert to the need to study the relationship between homework and academic achievement, particularly in mathematics. A literature review conducted by Yeworiew, Chu and Xu demonstrates that, while the findings cut in both directions, the weight of research favours homework. In fact, the Council of Ministers of Education, Canada (CMEC, 2014) has come down in favour of homework. Based upon Canadian national test surveys (PCAP), CMEC confirms that the math achievement of students who do not do homework is significantly lower than that of those doing regular homework.

Yeworiew and his research team provide further confirmation of this 2014 CMEC assessment. Utilizing the 2015 TIMSS study in Canada, involving 8,757 students and 276 schools in four provinces (Ontario, Quebec, Manitoba and Newfoundland/Labrador), the authors demonstrate the clear value of regular homework in modest amounts.

The research findings are effectively presented in a series of graphs mapping the study results, reprinted here directly from their AERA 2019 Toronto presentation:


The relationship between homework and achievement is becoming less of a mystery. Based upon the performance of Grade 8 students in the 2015 TIMSS study, short but frequent homework assignments contribute to improved student learning and achievement in mathematics. Frequent homework assignments, up to four times a week, have a positive effect on math achievement, but less so when the homework is of longer duration. No discernible differences were detected between girls and boys at the Grade 8 level in Canada.

Why do Canadian researchers produce so few studies like the University of Calgary project attempting to assess the impact of homework on achievement?  To what extent is it because Canadian homework studies tend to focus on psycho-social aspects such as the impact of homework on student attitudes and the opinions of parents?

Are we asking the right questions? "How much is enough?" is surely a sounder line of inquiry than "How do you feel when overburdened with homework?" What is really accomplished by asking "Does homework add to your anxieties?" Should we be more conscious of the inherent biases in such research questions?


Read Full Post »

Student achievement varies a great deal across the Organization for Economic Cooperation and Development (OECD) countries. Good teachers can have a significant impact upon their students' learning and achievement, and there is now research to support that contention.  What makes some teachers more effective than others is less clear.  It remains a question that cries out for further in-depth study.

A comprehensive research study reported in the latest issue of Education Next (Vol. 19, Spring 2019) tackles that fundamental question on an international comparative scale. Three American researchers, Eric A. Hanushek, Marc Piopiunik, and Simon Wiederhold, not only demonstrate that teachers' cognitive skills vary widely among developed nations, but that such differences matter greatly for student performance in school.

Developing, recruiting and training a teacher force with higher cognitive skills (Hanushek, Piopiunik, Wiederhold 2019) can be critical in improving student learning. “An increase of one standard deviation in teacher cognitive skills,” they claim, “is associated with an increase of 10 to 15 per cent of a standard deviation in student performance.” Comparing reading and math scores in 31 OECD countries, teachers in Finland come out with the highest cognitive skills. One quarter of the gaps in average student performance across countries would be closed if each of them were to raise the level of teachers’ cognitive skills to that of Finland.
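The headline effect size (one standard deviation of teacher cognitive skill associated with 10 to 15 per cent of a standard deviation in student performance) can be translated into familiar test-score points. The sketch below assumes a PISA-style score scale with a standard deviation of roughly 100 points; the midpoint effect value is my own simplification of the study's 10-15 per cent range, not a figure the authors report.

```python
# Back-of-envelope translation of the Hanushek-Piopiunik-Wiederhold effect
# size: 1 SD of teacher cognitive skill ~ 0.10-0.15 SD of student scores.
# The 0.125 midpoint and the 100-point score SD are assumptions.

def student_gain(teacher_sd_gain: float, effect: float = 0.125,
                 score_sd: float = 100.0) -> float:
    """Expected student score gain (in points) for a teacher-skill gain
    measured in standard deviations."""
    return teacher_sd_gain * effect * score_sd

print(student_gain(1.0))  # 12.5 points at the midpoint estimate
```

On this reading, even a full standard-deviation improvement in teacher skills buys a modest but meaningful score gain, which is why the authors frame it as one lever among several.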

What’s most fascinating about this study is the large role Canadian teachers play in the comparative data analysis for teacher cognitive skills.  Of the 6,402 teacher test-takers in 31 countries, the largest group, 834 (13 per cent), were from Canada. Based upon data gleaned from the OECD Program for the International Assessment of Adult Competencies (PIAAC), we now know where Canadian teachers rank in terms of their numeracy and literacy skills (See Figure 1). We also have a clearer indication of how Canadians with Bachelor’s degrees and Master’s or Doctoral degrees rate in terms of their core cognitive skills.

Teachers from Canada fare reasonably well, in the top third, in the comparative analysis of cognitive skills. In literacy, teachers in Canada perform above average, with a median score of 308 points out of 500 compared to the sample-wide average of 295 points.  If there’s a problem, it’s in terms of numeracy skills, where they perform slightly above the teacher-wide sample with a median score of 293, compared to the average of 292 points. Adult Canadians with Bachelor’s degrees actually outperform teachers in numeracy skills by 7 points. Teachers in Finland and Japan, for example, perform better than Canadians with Master’s or Doctoral degrees.

Since the September 2010 appearance of the McKinsey & Company study "Closing the talent gap," American policy-makers have considered teachers' own academic performance as "a key predictor" of higher student achievement, based upon teacher recruitment practices in countries that perform well on international tests. High-scoring countries like Singapore, Finland and Korea, for example, recruit their teacher force from the top third of their academic cohorts in university.

Securing sound data on the actual quality of recent Canadian teacher education cohorts is challenging because of the paucity of reported information. One claim that Canadian teachers come from the “top one third of high school graduates” put forward in a 2010 McKinsey & Company OECD study looks highly suspect.

A September 2008 review of Initial Teacher Education programs (Gambhir, Evans, Broad, Gaskell 2008) reported that admission cut-offs ranged from 65 per cent to over 90 per cent, depending upon the faculty of education. Most Canadian universities with Faculty of Education programs, to cite another fact, still have grade cut-off averages for acceptance into Arts and Science that hover between 70 per cent and 75 per cent. With the exception of OISE, Western, Queen's and UBC, teacher candidates are not drawn from the top third of their academic cohort, particularly in mathematics and sciences.

Differences in teachers' cognitive skills within a country also seem to have a bearing upon student performance. Plotting the student performance difference between math and reading (at the country level) against the difference in teacher cognitive skills between numeracy and literacy yields some intriguing results (Figure 2). An increase in teacher cognitive skills of one standard deviation is estimated to improve student achievement by 11 per cent of a standard deviation. The data for Canada show a teacher test-score difference between numeracy and literacy of -12 points.

The brand new American study (Hanushek, Piopiunik, Wiederhold 2019) also demonstrates that paying teachers better is a possible factor in attracting and retaining teachers with higher cognitive skills. In terms of wage premiums, teachers’ earnings in higher performing countries are generally higher, as borne out by Ireland, Germany and Korea, where teachers earn 30 to 45 per cent more than comparable college graduates in other jobs.

Teachers in Canada earn 17 per cent more than their comparators, while those in the USA and Sweden earn 22 per cent less. Increasing teacher pay has potential value in the United States where salaries discourage the ‘best and brightest’ from entering teaching. There is a caveat, noted by Hanushek and his research team:  Changes in policy must ensure that “higher salaries go to more effective teachers.”

Do smarter teachers make for smarter students? How sound is the evidence that teachers who know more are actually better teachers? Why do we put so much stock in improving student learning in literacy/reading and mathematics?  What potential flaws can you spot in this type of research? 

 

Read Full Post »

"Canadians can be proud of our showing in the 2015 Programme for International Student Assessment (PISA) report," declared Science consultant Bonnie Schmidt and former Council of Ministers of Education (CMEC) director Andrew Parkin in their first-off-the-mark December 6, 2016 response to the results. "We are," they added, "one of only a handful of countries that places in the top tier of the Organization for Economic Cooperation and Development (OECD) in each of the three subjects tested: science, reading and math."

"Canada" and "Canadian students," we were told, were once again riding high in the once-every-three-years international test sweepstakes. If that effusively positive response had a familiar ring, it was because it followed the official line advanced by a markedly similar CMEC media release, issued a few hours before the commentary.

Since our students, all students in each of our ten provincial school systems, were “excelling,” then it was time for a little national back-slapping. There’s one problem with that blanket analysis: it serves to maintain the status quo, engender complacency, obscure the critical Mathematics scores, and disguise the lopsided nature of student performance from region to region.

Hold on, not so fast, CMEC — the devil is in the details, which are more clearly portrayed in the OECD's own "Country Profile" for Canada. Yes, 15-year-olds in three Canadian provinces (Alberta, British Columbia, and Quebec) achieved some excellent results, but overall Mathematics scores were down, and students in over half of our provinces trailed off into mediocrity. Our real success was not in performance, but rather in reducing the achievement gap adversely affecting disadvantaged students.

Over half a million 15-year-olds in some 72 jurisdictions around the world completed PISA tests, and Schmidt and Parkin were not alone in making sweeping pronouncements about why some countries are up and others down in the global rankings.

Talking in aggregate terms about the PISA performance of 20,000 Canadian students in ten different provinces can be, and is, misleading, when the performance results in mathematics continue to lag, Ontario students continue to underperform, and students in two provinces, Manitoba and Saskatchewan, struggle in science, reading, and mathematics.  Explaining all that away is what breeds complacency in the school system.

My own PISA 2015 forecast was way off-base, and it taught me a lesson.  After the TIMSS 2015 Mathematics results released in November 2016, an East Asian sweep, led by Singapore and Korea, seemed like a safe bet. How Finland performs also attracts far less attention than it did in its halcyon days back in 2003 and 2006. The OECD's significant pivot away from excellence toward equity caught me napping, and I completely missed the significance of moving (2012 to 2015) from pencil-and-paper to computer-based tests.

Some solace can be found in the erroneous forecasts of others. The recent Alberta Teachers' Association (ATA) "Brace Yourself" memo, with its critique of standardized testing, seemed to forecast a calamitous drop in Alberta student performance levels. It only happened in Mathematics.

Advocates of the ‘Well-Being’ curriculum and broader assessment measures, championed by Toronto’s People for Education, will likely be temporarily thrown off-stride by the OECD’s new-found commitment to assessing equity in education. It will be harder now to paint PISA as evil and to discredit PISA results based upon such a narrow range of skills in reading, math and science.

The OECD’s “Country Profile” of Canada is worth studying carefully because it aggregates data from 2003 to 2015, clarifies the trends, and shows how Canadian students continue to struggle in mathematics far more than in reading and science.

Canadian students may have finished 12th in Mathematics with a 516 aggregate score, but the trend line continues to decline, down from 532 in 2003. Digging deeper, we see that students in only two provinces, Quebec (544) and BC (522), actually exceeded the national mean score. Canada's former leader in Mathematics performance, Alberta, continued its downward spiral from the lofty heights of 549 (2003) to 511 (2015).

Since Ontario students' provincial mathematics scores are declining, experts will be poring over the latest PISA results to see how bad it is in relation to the world's top-performing systems. No surprises here: Ontario students scored 509, finishing 4th in Canada and down from 530 on PISA 2003. Excellence will require a significant change in direction.

The biggest discovery in post-2015 PISA analysis was the positive link between explicit instruction and higher achievement in the 2015 core assessment subject, science. The most important factor linked with high performance remains socio-economic status (SES), but teacher-guided instruction ranked close behind, and students taught with minimal direction, in inquiry or project-based classes, simply performed less well on the global test.

The results of the 15-year-olds are largely determined over 10 years of schooling, and are not necessarily the direct consequence of the latest curriculum fad such as "discovery math."

It's better to look deeper into what this cohort of students were learning when they first entered the school system, in the mid-2000s. In the case of Canadian students, for example, student-centred learning was at its height, and the country was just awakening to the value of testing to determine what students were actually learning in class.

Where the student results are outstanding, such as Singapore and Estonia, it is not solely attributable to the excellence of teaching or the rigour of the math and science curriculum.

We know from the “tutoring explosion” in Canada’s major cities that the prevalence of private tuition classes after school is a contributing factor, and may explain the current advantage still enjoyed in mathematics by Pacific Rim students.

Children of Chinese heritage in Australia actually outperformed students in Shanghai on the 2012 PISA test, and we need to explore whether that may be true for their counterparts in Greater Vancouver. The so-called “Shanghai Effect” may be attributed as much to “tiger mothers” as it is to the quality of classroom instruction.

Whether Canada and Canadians continue to exhibit high PISA self-esteem or have simply plateaued does not matter as much as what we glean over the next few years from studying best international practice in teaching, learning, and assessment.

Surveying PISA student results, this much is clear: standing still is not an option in view of the profound changes that are taking place in life, work, and society.

 

Read Full Post »

With the release of the 2015 Program for International Student Assessment (PISA) on the horizon, the Organization for Economic Cooperation and Development (OECD) Education Office has stoked up the "Math Wars" with a new study. While the October 2016 report examines a number of key questions related to teaching Mathematics, OECD Education chose to highlight its findings on "memorization," presumably to dispel perceptions about "classroom drill" and its use in various countries.

The OECD, which administers the PISA assessments every three years to 15-year-olds around the globe, periodically publishes reports looking at slices of the data. Its most recent report, Ten Questions for Mathematics Teachers and How PISA Can Help Answer Them (October 2016), based upon the 2012 results, zeroes in on "memorization" and attempts to show that high-performing territories, like Shanghai-China, Korea, and Chinese-Taipei, rely less on memory work than lower-performing places like Ireland, the UK, and Australia.

American Mathematics educator Jo Boaler, renowned for "Creative Math," jumped upon the PISA study to buttress her case against "memorization" in elementary classrooms. In a highly contentious November 2016 Scientific American article, Boaler and co-author Pablo Zoido contended that PISA findings confirmed that "memorizers turned out to be the lowest achievers, and countries with high numbers of them—the U.S. was in the top third—also had the highest proportion of teens doing poorly on the PISA math assessment." Students who relied on memorization, they further argued, were "approximately half a year behind students who used relational and self-monitoring strategies" such as those in Japan and France.

Australian education researcher Greg Ashman took a closer look at the PISA study and called into question such hasty interpretations of the findings.  Figure 1.2, "How teachers teach and students learn," caught his eye, and he went to work interrogating the survey responses on "memorization" and the axes used to present the data.  The PISA analysis, he discovered, did not include an assessment of how teaching methods might be correlated with PISA scores in Mathematics.  Manitoba Mathematics professor Robert Craigen spotted a giant hole in the PISA analysis, noting that the "memorization" data related to the "at-home strategies of students," not their instructional experiences, and may well indicate that students who are improperly instructed in class resort to memorization on their own.

What would it look like, Ashman wondered, if the PISA report had plotted how students performed in relation to the preferred methods used on the continuum from "more student-oriented instruction" to "more teacher-directed instruction"? Breaking down all the data, he generated a new graph that showed how teaching method correlated with math performance and found a "positive correlation" between teacher-directed instruction and higher Math scores. "Correlations," he duly noted, "do not necessarily imply causal relationships," but a higher ratio of teacher-directed activity to student orientation was clearly associated with better results.
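Ashman's re-analysis boils down to computing a correlation between an instruction-style ratio and country-level math scores. The toy computation below shows the mechanics with entirely hypothetical numbers (these are not the actual PISA figures), using a plain Pearson correlation.

```python
import math

# Hypothetical country-level data (NOT actual PISA figures): ratio of
# teacher-directed to student-oriented instruction, and mean math score.
ratio = [0.8, 1.0, 1.2, 1.5, 1.7, 2.0]
scores = [480, 495, 500, 515, 520, 540]

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

r = pearson_r(ratio, scores)
print(round(r, 2))  # 0.99 -- strongly positive for this illustrative data
```

As Ashman stressed, a strong positive r on such a plot still says nothing about causation; it only shows the association the original PISA report chose not to chart.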

Jumping on the latest research to seek justification for her own 'meta-beliefs' is normal practice for Boaler and her "Discovery Math" education disciples. After junking, once again, the 'strawmen' of traditional Mathematics — "rote memorization" and "drill" — Boaler and Zoido wax philosophical and poetic: "If American classrooms begin to present the subject as one of open, visual, creative inquiry, accompanied by growth-mindset messages, more students will engage with math's real beauty. PISA scores would rise, and, more important, our society could better tap the unlimited mathematical potential of our children." That's definitely stretching the evidence far beyond the breaking point.

The "Math Wars" do generate what University of Virginia psychologist Daniel T. Willingham has aptly described as "a fair amount of caricature." The recent Boaler-Zoido Scientific American article is a prime example of that tendency. Most serious scholars of cognition tend to support the common-ground position that learning mathematics requires three distinct types of knowledge: factual, procedural and conceptual. "Factual knowledge," Willingham points out, "includes having already in memory the answers to a small set of problems of addition, subtraction, multiplication, and division." While some students can learn Mathematics through invented strategies, that route cannot be relied upon for all children. On the other hand, knowledge of procedures is no guarantee of conceptual understanding, particularly when it comes to complexities such as dividing fractions. It's clear to most sensible observers that knowing math facts, procedures and concepts is what counts when it comes to mastering mathematics.

Simply ignoring research that contradicts your 'meta-beliefs' is common on the Math Education battlefield. Recent academic research on "memorization" that contradicts Boaler and her entourage is simply ignored, even that emanating from her own university. Two years ago, Shaozheng Qin and Vinod Menon of Stanford University Medical School led a team that provided scientifically validated evidence that "rote memorization" plays a critical role in building capacity to solve complex calculations.

Based upon a clinical study of 68 children, aged 7 to 9, followed over the course of one year, their 2014 Nature Neuroscience study found that memorizing the answers to simple math problems, such as basic addition or multiplication, forms a key step in a child's cognitive development, helping bridge the gap between counting on fingers and tackling more complex calculations. Memorizing the basics, they concluded, is the gateway to activating the hippocampus, a key brain structure for memory, which gradually expands in "overlapping waves" to accommodate the greater demands of more complex math.

The whole debate over memorization is suspect because of the imprecision in the use of the term. Practice, drilling, and memorization are not the same, even though they get conflated in Jo Boaler's work and in much of the current Mathematics Education literature. Back in July 2012, D.T. Willingham made this crucial point and provided some valuable distinctions. "Practice," as defined by Anders Ericsson, involves performing tasks and receiving feedback on that performance, executed for the purpose of improvement. "Drilling" connotes repetition for the purpose of achieving automaticity, which, at its worst, amounts to mindless repetition or parroting. "Memorization," on the other hand, relates to the goal of something ending up in long-term memory with ready access, but does not imply using any particular method to achieve that goal.

Memorization has become a dirty word in teaching and learning, laden with so much baggage that it conjures up mental pictures of "drill and kill" in the classroom. The 2016 PISA study appears to perpetuate such stereotyping and, worst of all, completely misses the "positive correlation" between teacher-directed or explicit instruction and better performance in mathematics.

Why does the PISA Study tend to associate memorization in home-study settings with the drudgery of drill in the classroom?  To what extent does the PISA Study on Mathematics Teaching support the claims made by Jo Boaler and her ‘Discovery Math’ advocates? When it comes to assessing the most effective teaching methods, why did the PISA researchers essentially take a pass? 

 

Read Full Post »

The Chinese city of Shanghai has a school system that produces students who soar far above the rest.  On the 2009 and 2012 Programme for International Student Assessment (PISA) tests, administered to 15-year-olds worldwide, Shanghai-China students ranked first in all three major domains: mathematics, reading and science.

Until recently, the secret of that astounding student achievement was essentially shrouded in mystery.  With the release of the May 17 World Bank study, "How Shanghai Does It," the answers are beginning to emerge, providing vitally important lessons for education policy-makers in Canadian school systems and far beyond.

The World Bank report on Shanghai education, issued by World Bank research director Harry Patrinos, provides a counterpoint to the prevailing narrative that North American school systems should look to Finland for lessons on school improvement. It demonstrates, in incredible detail, what lies behind Shanghai-China’s rise to ‘education super-nova.’

The report, based upon SABER, a comprehensive World Bank system for benchmarking school system performance, delves deeply into how and why Shanghai students achieve excellent learning results. In the process, it smashes a few stubborn stereotypes and dispels the image of a mechanistic, test-driven, joyless educational enterprise.

Shanghai's student successes stem, according to the World Bank, from a focus on teaching excellence. What's unique about Shanghai-China is the way it "grooms, supports, and manages" teachers to raise educational quality, and a culture that accords great respect to the teaching profession.

We know that Shanghai students break records for extraordinary test scores, but lesser known is the success achieved in raising the floor for overall student achievement. The city has the highest share of disadvantaged students in the top 25 per cent range on PISA tests, and that is no accident. Educational equity is becoming a higher priority, especially targeting children of migrants.

Teachers in Shanghai are, by all accounts, well-trained and mentored after they become licensed to teach in schools. Ongoing professional development is not only offered, as in Canada, but integrated into a “collegial and supportive” professional growth process.  Subject mastery and pedagogical training go together in developing skilled and accomplished teachers.

Teaching time is organized far differently than in Canadian schools.  The Chinese teachers spend only one-third of their time actually teaching and far more emphasis is placed on preparation of demonstration lessons. Teaching effectiveness is the clear priority, not scattered efforts spread across a range of classes.

Teaching is also rewarded far differently.  Instead of being paid on a lock-step grid based upon seniority, Shanghai teachers move up the ladder based upon merit and guided by principals who are trained as instructional leaders not building administrators.

The biggest surprise is how Shanghai’s school system works to reduce educational inequalities. While education funding is vested in the school district, a proportion of the ‘education tax’ is specifically allocated to poor and low performing school districts.

One educational innovation worth emulating is the "entrusted school" management model used to raise up underperforming schools.  High-performing Shanghai schools are "twinned" with struggling schools within the state system. Instead of establishing private schools or creating charters, the Chinese use "twinning" to extend management, training, and resource support to teachers and students in the struggling schools.

Since 2006, the world of education has been enraptured with the so-called “Finnish Miracle,” while Shanghai-China has surged far ahead in student achievement. Instead of hitching our school improvement wagon to Finnish education promoter extraordinaire Pasi Sahlberg and his Finnish lessons, we should be looking at best practice anywhere and everywhere.

Let’s start by finding out where exactly we rank and what might be the areas that need improvement.  We generate lots of national, provincial and international student performance data, so why not put it to better use?

A really bold initiative would be to invite the World Bank to assess one Canadian provincial school system in relation to the SABER benchmarks.  The State of Maryland in the United States has already done so, and the SABER report for Maryland demonstrates just how incredibly valuable it can be in planning for, and advancing, school improvement.

The Finnish Education Miracle has begun to lose its lustre. Perhaps it’s time to consider edutourism junkets to Shanghai instead of Helsinki – in search of educational excellence as well as innovative teaching-learning ideas.

*An earlier version of this Commentary appeared in The Telegraph-Journal, provincial edition, based in Saint John, NB.

Will the World Bank report on Shanghai’s educational success be a wake-up call for North American educational leaders? Do popular stereotypes about Chinese education obscure our vision of Shanghai’s remarkable student performance achievements? Should we be producing more detailed studies of “Shanghai Lessons” for educators? And which Canadian province will be the first to follow Maryland in stepping up to participate in the SABER assessment of school system effectiveness?


A lively national conversation is underway in the United States over stalled upward mobility and stark income inequality, and it has a more genteel echo in Canada.  Many North American educators point to poverty as the explanation for American students’ mediocre test scores, and it also serves as a favoured rationale for explaining away the wide variations in achievement levels among and within Canadian provinces. Only recently have policy analysts, drilling down into the PISA 2012 Mathematics data, begun to look at the alarming achievement gaps between states and provinces, the relationship between education expenditures and performance levels, and the bunching of students in the mid-range of achievement.

The socio-economic determinists offer a simple-minded, mono-causal explanation for chronic student under-performance. American education policy analyst Michael Petrilli and Brandon Wright of The Thomas B. Fordham Institute recently recapped the standard lines: If teachers in struggling U.S. schools taught in Finland, says Finnish educator Pasi Sahlberg, they would flourish—in part because of “support from homes unchallenged by poverty.” Michael Rebell and Jessica Wolff at Columbia University’s Teachers College argue that middling test scores reflect a “poverty crisis” in the United States, not an “education crisis.” Adding union muscle to the argument, American Federation of Teachers president Randi Weingarten calls poverty “the elephant in the room” that accounts for poor student performance.

The best data we have to tackle these critical questions come from the OECD, which administers the Program for International Student Assessment (PISA) and recently released its annual Education at a Glance 2015 report.  For its own analyses, PISA uses an index of economic, social, and cultural status (ESCS) that considers parental occupation and education, family wealth, home educational resources, and family possessions related to “classical” culture. PISA analysts use the index to stratify each country’s student population into quartiles. That broadens the focus so it’s not just about addressing the under-performance of disadvantaged children.

The PISA socio-economic analysis identifies the key variations among international educational jurisdictions. Countries like Belgium and France are relatively better at teaching their higher-status students, while other countries like Canada and Finland do relatively better at instructing students from lower-status families. Contrary to past assumptions, the United States falls almost exactly on the regression line. It does equally well (or equally poorly, if you prefer) at teaching the least well-off as it does those coming from families in the top quartile of the ESCS index.

A Fall 2014 Education Next report by Eric Hanushek, Paul Peterson and Ludger Woessmann pointed out the wide variations, country-to-country, in overall Mathematics proficiency.  Some 35 percent of the U.S. class of 2015 reach or exceed the NAEP proficiency level in math. Based on their calculations, this percentage places the United States at the 27th rank among the 34 OECD countries. That ranking is somewhat lower for students from advantaged backgrounds (28th) than for those from disadvantaged ones (20th).

Overall assessments of Mathematics proficiency on PISA offer no real surprises. Compared to the U.S., the percentage of students who are math proficient is nearly twice as large in Korea (65%), Japan (59%), and Switzerland (57%). The United States also lags behind Finland (52%), Canada (51%), Germany (50%), Australia (45%), France (42%), and the United Kingdom (41%). Within the U.S., the range is phenomenal – from a high of 51% in Massachusetts to a low of 19% in Mississippi.

Cross-national snapshots can also mislead, because Canadian students have plateaued on the PISA tests over the past decade.  While Canada remains among the high-level achievers, the performance of the country’s 15-year-olds in mathematics has declined, with a 14-point dip over the past nine years. While performance in reading has remained relatively stable, the decline in science performance was “statistically significant,” dipping from an average of 534 in 2006 to 529 in 2009.

Much like the United States, Canada exhibits significant variations from one provincial school system to another.  A 2013 Council of Ministers of Education, Canada (CMEC) review of the OECD PISA 2012 Mathematics performance levels revealed the stark achievement inequalities. Four Canadian provinces set the pace – Quebec, British Columbia, Alberta, and Ontario – and the remaining six are a drag on our average scores. Fully 25% of Prince Edward Island students score below Level 2 in Mathematics proficiency, exceeding the OECD average (23%). The other provinces with the next highest levels of under-performers were: Manitoba (21%), Newfoundland/Labrador (21%), Nova Scotia (18%), and New Brunswick (16%).

There is no case for complacency in Canada, as pointed out, repeatedly, by Dr. Paul Cappon, former CEO of the Canadian Council on Learning (2005-2011) and our leading expert on comparative international standards. For a “high-achieving” country, Canada has a lower proportion of students who perform at the highest levels of Mathematics on recent PISA tests (CMEC 2013, Figure 1.3, p. 25).  Canada’s 15-year-olds are increasingly bunched in the mid-range and, when it comes to scoring Level 4 and above on Mathematics, most provinces score at or below the OECD average of 31%.  The proportion of high achievers (Level 4 and above in 2012) was as follows: PEI (22%); Newfoundland/Labrador (27%); Nova Scotia (28%); Manitoba (28%); Saskatchewan (33%); and Ontario (36%). Mathematics students from Quebec continue to be an exception: 48% score Level 4 and above, 17 percentage points above the OECD average.

Students coming from families with high education levels also tend to do well on the PISA Mathematics tests. The top five OECD countries in this category are Korea (73%), Poland (71%), Japan (68%), Switzerland (65%), and Germany (64%), marginally ahead of the state of Massachusetts at 62%. Five other American states have proficiency rates of 58% or 59% among such students, comparable to the Czech Republic (58%) and higher than Canada (57%) and Finland (56%). Canada ranked 12th on this measure, well behind Korea, Poland, Japan, Switzerland and Germany.

Educators professing to be “progressive” in outlook tend to insist that we must cure poverty before we can raise the standards of student performance. More pragmatic educators tend to claim that Canadian schools are doing fine, except for the schools serving our disadvantaged populations, particularly Indigenous and Black children.  Taking a broad, international perspective, it appears that both assumptions are questionable. There are really two achievement gaps to be bridged – one between the affluent/advantaged and the poor/disadvantaged, and the other between Canadian high achievers and their counterparts in the top-performing PISA countries.

Does low socio-economic status (SES), marked by child and family poverty, set the pattern for student achievement in a deterministic fashion?  To what extent can and do students break out of that mold? How critical are other factors such as better teacher quality, higher curriculum standards, and ingrained ethno-cultural attitudes? Do school systems like Canada’s and Finland’s tend to focus on reducing educational inequalities at the expense of challenging their high achievers?  Is this the real reason that many leading western G20 countries continue to lag behind those in Asia?
