
Posts Tagged ‘OECD Education’

The Programme for International Student Assessment (PISA), managed by Andreas Schleicher and the OECD Directorate for Education and Skills in Paris, France, is still regarded as the “gold standard” in comparative student assessment and enjoys a rather charmed life. Every three years, educational leaders, commentators, and engaged teachers eagerly await the results of student testing and its so-called ‘league tables’ ranking the performance of 15-year-olds from some 79 participating jurisdictions. A new book, Dire Straits: Education Reforms, Ideology, Vested Interests and Evidence, written by two Spanish researchers, Montserrat Gomendio and José Ignacio Wert, is sure to rock that edifice and punch holes in the credibility of the OECD’s education branch.

Student assessment and accountability are essential and yet elusive in global K-12 education, both within countries and internationally, so school reformers put their faith in international large-scale assessments (ILSAs) like PISA to provide solid evidence on how students are actually performing in the core skills of reading, mathematics and science. Across the globe, educational leaders and policy-makers looked to PISA for evidence and guidance on what works in different countries, and particularly on what can be learned from student achievement gains in top-performing nations. That has not happened, according to one of the book’s authors, Montserrat Gomendio, OECD’s former deputy director for education and head of its Centre for Skills. It’s all spelled out in a devastating critique in the current Winter 2023 edition of Education Next.

PISA is OECD Education’s crown jewel in an organization dedicated to providing reliable data and policy advice, encouraging comparative analysis and learning exchanges worldwide.  From the first cycle of PISA (2000) to the last (2018), the number of participating countries increased from a rather homogeneous group of 32 OECD countries to some 79, owing largely to the addition of many low- and middle-income countries. Flush with its own success, the OECD made a boastful claim: “PISA has become the world’s premier yardstick for evaluating the quality, equity and efficiency of school systems, and an influential force for education reform.”

PISA’s own data tells the tale. “After almost two decades of testing, student outcomes have not improved overall in OECD nations or most other participating countries,” according to Gomendio. She recognizes that, up until 2018, a global recession, the rise of social media, and environmental disasters did present “headwinds for school-improvement efforts.” Failing to achieve its mission, she points out, led to “blame games.” That was precipitated by the dawning realization that student outcomes had flatlined from 2000 to 2018. In response, OECD Education officials pointed fingers at its own member states for not taking advantage of the PISA data and carrying out the recommended policy changes.

Policy recommendations from PISA are built upon two different approaches – quantitative analyses correlating student outcomes with features of education systems, and qualitative analyses of low- and top-performing countries. It is commonly agreed that PISA’s quantitative analyses of cross-sectional samples and correlations cannot be used to draw causal inferences. Its qualitative analyses, particularly with regard to Nordic countries, also suffer from serious drawbacks such as cherry-picking. Other weaknesses, she points out in Education Next, have gone largely unnoticed. One of the biggest question marks is the reliability of student results on such “low stakes” tests. In the case of Australia, for example, the Australian Council for Educational Research (ACER) found that a majority of Australian students (73%) may not have taken the PISA test seriously and would have invested more effort if it counted towards their marks.

Quality and Equity – Confronting the Contradictions

PISA seeks to measure two complementary dimensions of education systems: quality and equity. Measuring quality on the basis of average student test scores is far easier than assessing equity. To do so, PISA employs a multidimensional concept using metrics such as the relationship between socioeconomic status and student performance, the degree of differences in student performance within and between schools, and many others. None of these variables, Gomendio points out, “tell the full story” and “each of them leads to different conclusions.” So PISA’s prism on equity is ultimately too narrow and somewhat unreliable.

PISA’s analysis of school choice and its policy recommendations on that issue draw fire from Gomendio and Wert. Claims that students in private schools do not perform better than those in public schools (after correcting for socioeconomic status) are problematic. Analyses lumping private schools together with government-funded, privately managed charter schools skew the results and make it impossible to disaggregate the data. That explains why PISA analyses are at odds with other international assessments, as well as research studies, which show that “school choice often does lead to better student outcomes without necessarily generating segregation.” In addition, the small number of countries with early tracking (streaming into academic and applied/vocational) show “little (if any) differences in student performance and employability rates for vocational-education students.” It is clear that PISA would benefit from thinking outside the box, paying attention to academic research and looking at the broader picture.

The new book Dire Straits, written by two Spanish researchers, squarely confronts PISA’s implicit bias in favour of Finland and other Nordic countries. The authors are particularly critical of PISA’s analyses of Finland and Germany. They call into question the first PISA cycle’s lionizing of Finland for its “quality and equity” and its labelling of Germany as a “heavily tracked system” that promoted inequity and “should be avoided.”

Nordic societies like Finland do get a free ride with PISA because they were egalitarian long before the inception of PISA. Egalitarian societies like Finland possess natural advantages since teachers work with a more uniform student population and are better positioned to implement inclusive policies across the board to all students. More stratified societies in Europe and Latin America, for example, require more differentiated approaches to meet the needs of the full spectrum of students. More recognition should be accorded to stratified societies with income inequalities that tend to have bigger challenges closing the equity gap. In the case of Canada, for example, it is useful to examine how our country manages to maintain reasonable student achievement standards, while alleviating the equity gap, particularly in relation to the United States.

Identifying Exemplars, Applying the Right Lessons

PISA completely missed the boat on the rise of student outcomes in Singapore and its East Asian neighbours and the relative decline of Finland. A few decades ago, Singapore had a largely illiterate population and very few natural resources. The country made a decision to invest in human capital as the engine of economic growth and prosperity, and, within a few decades, it became the top performer in all international assessment programs. Part of that improvement can be attributed to implementing tracking in primary school in an effort to decrease its high dropout rate. Once this was achieved, the country delayed tracking until the end of primary school. So far, PISA has not provided a coherent and differentiated analysis of the “Singapore Miracle.”

Teacher quality is more salient than PISA recognizes in its analyses. In Singapore and the East Asian countries, only top-performing students can enter education-degree programs, whereas poorer-performing Latin American countries tend to draw teachers from the weaker academic ranks. Professional recruitment programs are mostly weak and teacher evaluation mechanisms almost non-existent. Teacher unions are not always helpful in improving the quality of instruction. In Latin America, teacher unions exercise considerable clout and have succeeded in securing lower class sizes, generating more teaching positions. Top-performing East Asian countries, on the other hand, tend to have weaker unions, so there are fewer political costs involved in running larger class sizes or in implementing rigorous teacher evaluation systems. Increases in education spending get invested in reducing class sizes, contrary to the PISA recommendation and in the face of robust evidence that smaller classes do not improve student outcomes.

Conclusions

Ideology, education governance and conflicts of interest all serve to undermine the overall effectiveness of evidence-based, PISA-informed policy prescriptions. Education authorities tend to be risk-averse when it comes to implementing targeted policy prescriptions and resisting pressures to increase spending levels, pressures driven by vested interests, most notably local education authorities and teacher unions.

Three key lessons jump out of the latest book on PISA. First, decreases in class size and increases in teacher salaries do little to improve student achievement, but such policy recommendations run headlong into the vested interests of unions and the preferences of active parents alert to any diminution in the resources received from public funds. Second, some influential factors (such as school autonomy and site-based management) are “strongly context-dependent” and difficult for policymakers to interpret; in such cases, applying policies universally can yield dire consequences. Finally, attempts to measure equity, including those of PISA analysts, tend to be inconclusive and partial, leading to recommendations more often than not “heavily influenced by ideology.” This has led to a universal recommendation to apply comprehensive policies and avoid those regarded as ‘discriminatory’ (such as ability grouping and early tracking). Such policies lead to the worst equity outcomes in more stratified societies.

Pointing fingers and apportioning blame has become all-too-common in OECD’s highly influential PISA reports.  What’s clear from the latest critique, levelled by the two former PISA insiders, is that flatlined student outcomes and policy shortcomings have much to do with PISA’s implicit biases (ideology), structural impediments (union advocacy), and conflicts of interest (service provider capture). That is why, according to the critics, PISA is failing in its mission.

Judging from the latest book, PISA has made little difference in improving school systems.  Is PISA failing in its mission? With so much evidence from student testing, why do education systems tend to brush aside the runaway success of top-performing Asian countries and, perhaps most importantly, why do so many systems continue to struggle?


Read Full Post »

“All that glitters is not gold” is a famous proverb plucked from William Shakespeare‘s play The Merchant of Venice that may well apply to recent international appraisals of K-12 education in Canada. Such rosy assessments tend to put a shiny lustre on what is essentially a sound and ‘pretty good’ school system that has lost ground to competing nations over the past decade.

Five years ago, the Organization for Economic Cooperation and Development (OECD) produced a rather rosy Education Policy Outlook for Canada as part of a series of reports offering comparative analysis of education policies and reforms across the world’s developed countries. Canada’s overall performance, aggregated from widely varied provincial assessment data, looked good in comparison with the United States, the United Kingdom, and Australia. Most significantly, the OECD assessors brushed aside concerns about “plateaued student achievement” on the Programme for International Student Assessment (PISA) tests and the decline in the proportion of top-performing students.

Emerging concerns were most clearly expressed in Dr. Paul Cappon’s final 2010 report for the Canadian Council on Learning. Student scores on the 2009 PISA test had revealed that Canadian 15-year-olds demonstrated relatively strong sets of skills in reading, math and science, but they were already slipping relative to high performing Asian countries and in some cases in absolute terms. “What I’m hoping,” Cappon said at the outset of his final cross-Canada tour, “is that when people realize that Canada is slipping down the international learning curve we’re not going to be able to compete in the future unless we get our act together.”

OECD Education Policy Outlook assessments and Country reports are based upon templates that tend to favour diverse and well-funded school systems like that of Canada. The six identified policy levers in 2015 were: 1) equity and quality of education; 2) preparing students for the future; 3) school improvement; 4) evaluation and assessment; 5) governance; and 6) funding.  Such public policy forecasts, based upon conventional criteria and historic trends, also tend to demonstrate “path dependency” which limits the capacity to capture radical shifts in context or dynamic changes in educational direction.

Fifteen-year-old students in Canada, based upon triennial PISA tests from 2000 to 2018, continue to perform above the OECD average in reading, mathematics and science. Our most economically and socially disadvantaged students, in aggregate, do relatively better than those in competing countries, demonstrating more equity than in most other countries.  A considerably higher proportion of Canadian K-12 students proceed to post-secondary education in universities and colleges. That much has not changed across time.

Three significant changes can be identified from the accumulating OECD student assessment and survey data and they deserve far more critical scrutiny:

Downward Trend in Student Performance: The performance trends for Canadian fifteen-year-olds are consistently downward from 2000 to 2018 in READING, from 2003 to 2018 in MATHEMATICS, and from 2006 to 2018 in SCIENCE. While the OECD average scores are also in decline as more countries are included in PISA, the descent is more pronounced among students from Canada. Students in Canada’s top-performing provinces of Alberta, Ontario, British Columbia and Quebec (Mathematics) tend to buoy up the lagging results produced by students from New Brunswick, Newfoundland/Labrador, Saskatchewan, and Manitoba.

Deteriorating Classroom Disciplinary Climate:

The 2015 Education Policy Outlook for Canada flagged one measure, based upon student survey responses, where Canada simply met the OECD standard – the index of classrooms conducive to learning (Figure 5, OECD Canada, 2015). That largely undiagnosed problem has worsened over the past three years. Canada ranked 60th out of 77 participating nations and educational districts in the OECD’s 2018 index of disciplinary climate, released on December 4, 2019. According to a global student survey conducted in the spring of 2018, one in five 15-year-old students reports that learning time is lost to noise, distractions, and disorder, so much so that it detracts from learning in class. A relatively high proportion of Canadian students say the teacher is not listened to and it takes a long time for the class to settle down. In addition, students regularly skip school and arrive late to class.

High Incidence of Fear of Failure:

Personal anxieties may also run higher among Canadian students when they confront standardized tests and fear failing them. In Canada, the OECD 2019 Education GPS report states, “15-year-old students have a strong fear of failure,” ranking 6th among 77 national student groups participating in the survey. Fear of failure runs highest among students in Chinese Taipei, Singapore, Macau, Japan, and Germany, but is less pronounced in high-performing countries such as Korea, Estonia, and Finland. Such fears are present to the same degree among students in the United Kingdom, but less so in the United States. No analysis whatsoever is offered to explain why fears run so comparatively high among teens in Canada.

The initial report on the Canadian Results of the OECD PISA 2018 Study, released by the Council of Ministers of Education, Canada (CMEC) in early December 2019, is of little help in evaluating these rather striking trends. Like previous reports in the CMEC series, it puts a positive spin on the aggregate results by casting them within a broad, global context, lumping together countries with radically different commitments to education in terms of spending and resources. It is possible to ferret out anomalies and to conduct province-by-province comparisons, but only with time, effort, and attention to detail. That is enough to keep the findings either buried or accessible only to education assessment specialists.

Does the Canadian Education Policy Outlook ventured in 2015 stand up under close analysis, five years on? What’s missing from the OECD and CMEC assessment reports for Canada over the past decade? Should the Canadian public be concerned about the downward trend in the demonstration of core skills in reading, mathematics and science? Is disciplinary climate now a real concern in Canadian classrooms? And why are Canadian students so afraid of failing in our schools when grade promotion and graduation rates are at record levels?

Read Full Post »

Student achievement varies a great deal across the Organization for Economic Cooperation and Development (OECD) countries. Good teachers can have a significant impact upon their students’ learning and achievement, and there is now research to support that contention. What makes some teachers more effective than others is less clear. It remains one question that cries out for further in-depth study.

A comprehensive research study reported in the latest issue of Education Next (Vol. 19, Spring 2019) tackles that fundamental question on an international comparative scale. Three American researchers, Eric A. Hanushek, Marc Piopiunik, and Simon Wiederhold, not only demonstrate that teachers’ cognitive skills vary widely among developed nations, but that such differences matter greatly for student performance in school.

Developing, recruiting and training a teacher force with higher cognitive skills (Hanushek, Piopiunik, Wiederhold 2019) can be critical in improving student learning. “An increase of one standard deviation in teacher cognitive skills,” they claim, “is associated with an increase of 10 to 15 per cent of a standard deviation in student performance.” Comparing reading and math scores in 31 OECD countries, teachers in Finland come out with the highest cognitive skills. One quarter of the gaps in average student performance across countries would be closed if each of them were to raise the level of teachers’ cognitive skills to that of Finland.
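To put the claimed effect size in concrete terms, here is a rough back-of-the-envelope sketch, not taken from the study itself; it assumes a student-score standard deviation of 100 points (the nominal PISA scale) and applies the 10 to 15 per cent coefficient reported above:

```python
# Illustrative arithmetic only: the function and its defaults are
# assumptions for this sketch, not code or exact parameters from
# Hanushek, Piopiunik and Wiederhold (2019).

def predicted_student_gain(teacher_skill_shift_sd, coefficient=0.125, student_sd=100.0):
    """Predicted change in student scores (in points) for a shift in
    teacher cognitive skills measured in standard deviations (SD).

    coefficient: fraction of a student-score SD gained per 1 SD of
    teacher skills (the study reports a 0.10-0.15 range).
    student_sd:  assumed SD of the student score scale, here 100 points.
    """
    return teacher_skill_shift_sd * coefficient * student_sd

# A one-SD improvement in teacher cognitive skills:
low = predicted_student_gain(1.0, coefficient=0.10)   # 10.0 points
high = predicted_student_gain(1.0, coefficient=0.15)  # 15.0 points
mid = predicted_student_gain(1.0)                     # 12.5 points at the midpoint
```

On these assumptions, closing even half of a one-SD gap in teacher skills would be worth roughly 5 to 7.5 points on a PISA-style scale, which gives a sense of why the authors treat teacher quality as a first-order policy lever.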

What’s most fascinating about this study is the large role Canadian teachers play in the comparative data analysis for teacher cognitive skills.  Of the 6,402 teacher test-takers in 31 countries, the largest group, 834 (13 per cent), were from Canada. Based upon data gleaned from the OECD Program for the International Assessment of Adult Competencies (PIAAC), we now know where Canadian teachers rank in terms of their numeracy and literacy skills (See Figure 1). We also have a clearer indication of how Canadians with Bachelor’s degrees and Master’s or Doctoral degrees rate in terms of their core cognitive skills.

Teachers from Canada fare reasonably well, in the top third, in the comparative analysis of cognitive skills. In literacy, teachers in Canada perform above average, with a median score of 308 points out of 500 compared to the sample-wide average of 295 points.  If there’s a problem, it’s in terms of numeracy skills, where they perform slightly above the teacher-wide sample with a median score of 293, compared to the average of 292 points. Adult Canadians with Bachelor’s degrees actually outperform teachers in numeracy skills by 7 points. Teachers in Finland and Japan, for example, perform better than Canadians with Master’s or Doctoral degrees.

Since the September 2010 appearance of the McKinsey & Company study “Closing the talent gap,” American policy-makers have considered teachers’ own academic performance as “a key predictor” of higher student achievement, based upon teacher recruitment practices in countries that perform well on international tests. High-scoring countries like Singapore, Finland and Korea, for example, recruit their teacher force from the top third of their academic cohorts in university.

Securing sound data on the actual quality of recent Canadian teacher education cohorts is challenging because of the paucity of reported information. One claim that Canadian teachers come from the “top one third of high school graduates,” put forward in the 2010 McKinsey & Company study, looks highly suspect.

A September 2008 review of Initial Teacher Education Programs  (Gambhir, Evans, Broad, Gaskell 2008), reported that admission cut-offs ranged from 65 per cent to over 90 per cent, depending upon the faculty of education. Most of the Canadian universities with Faculty of Education programs, to cite another fact, still have grade cut-off averages for acceptance in the Arts and Science that hover between 70 per cent and 75 per cent. With the exception of OISE, Western, Queen’s and UBC, teacher candidates are not drawn from the top third of their academic cohort, particularly in mathematics and sciences.

Differences in teachers’ cognitive skills within a country also seem to have a bearing upon student performance. Plotting the student performance difference between math and reading (at the country level) against the difference in teacher cognitive skills between numeracy and literacy yields some intriguing results (Figure 2). An increase in teacher cognitive skills of one standard deviation is estimated to improve student achievement by 11 per cent of a standard deviation. The data for Canada shows a teacher test-score difference between numeracy and literacy of -12 points.

The brand-new American study (Hanushek, Piopiunik, Wiederhold 2019) also demonstrates that paying teachers better is a possible factor in attracting and retaining teachers with higher cognitive skills. In terms of wage premiums, teachers’ earnings in higher-performing countries are generally higher, as borne out by Ireland, Germany and Korea, where teachers earn 30 to 45 per cent more than comparable college graduates in other jobs.

Teachers in Canada earn 17 per cent more than their comparators, while those in the USA and Sweden earn 22 per cent less. Increasing teacher pay has potential value in the United States where salaries discourage the ‘best and brightest’ from entering teaching. There is a caveat, noted by Hanushek and his research team:  Changes in policy must ensure that “higher salaries go to more effective teachers.”

Do smarter teachers make for smarter students? How sound is the evidence that teachers who know more are actually better teachers? Why do we put so much stock in improving student learning in literacy/reading and mathematics?  What potential flaws can you spot in this type of research? 


Read Full Post »

Equipping the rising generation of students with what are termed “21st century skills” is all the rage. Since the fall of 2010, British Columbia’s Ministry of Education, like many other education authorities, has been virtually obsessed with advancing a B.C. Education Plan championing the latest iteration of a global education transformation movement – technology-based personalized learning.


The whole concept of 21st century skills, promoted by the World Economic Forum and the Organization for Economic Cooperation and Development (OECD), rests upon widely-circulated global theories about our fast changing, technology-driven, unpredictable future. Leading proponents of the new dogma contend that it is now essential to ensure that our youth are “equipped with the right type of skills to successfully navigate through an ever-changing, technology-rich work environment’ and ready to “continuously maintain their skills, upskill and/or reskill throughout their working lives.”

Much of the 21st century learning mantra went unchallenged and escaped critical scrutiny until quite recently. Today many of the education researchers challenging the 21st century learning orthodoxy are charter members of researchED, a British grassroots teacher research organization, founded by teacher Tom Bennett five years ago.

A growing number of outstanding education researchers, including Daniel T. Willingham, Dylan Wiliam, and Paul A. Kirschner, have been drawn to researchED because of its commitment to scrutinize prevailing theories, expose education myths, and encourage more evidence-informed curriculum policy and teaching practice. That is precisely why I took the lead in bringing researchED to Canada in November 2017.

British Columbia teachers have given the futuristic B.C. Education Plan a cool reception and are, by every indication, ripe for teacher-led research and curriculum changes that pass the evidence-based litmus test.

A 2017 BCTF survey of teachers gave the B.C. Education Plan mixed reviews and has already raised serious concerns about the province’s latest iteration of a “21st century skills” curriculum. Teachers’ concerns over “personalized learning” and “competency-based assessment” focus on the “multiple challenges of implementation” without adequate resource support and technology, but much of the strongest criticism was motivated by “confusion” over its purposes, concern over the lack of supporting research, and fears that it would lead to “a less rigorous academic curriculum.”

Such criticisms are well-founded and consistent with new academic research widely discussed in researchED circles and now finding its way into peer-reviewed education research journals. Professor Paul A. Kirschner and his Open University of the Netherlands team are in the forefront of the movement to interrogate the claims and construct an alternative approach to preparing our children for future success.

Research-informed educators are now asking whether the so-called 21st century skills actually exist. If these skills do exist, to what extent are they new or just repackaged from previous generations of attempted reform? Why, they ask, has the number of identified skills ballooned from four in 2009 (Partnership for 21st Century Skills) to 16 in 2016 (World Economic Forum)?

What students need – and most teachers actually want – is what Kirschner has termed “future-proof education.” Based upon recent cognitive science research, he and others are urging teachers to take action themselves to ensure that evidence-informed practice wins the day.

The best way forward may well be deceptively simple: set aside the “21st century skills” paradigm in favour of the acquisition of knowledge, skills, and attitudes necessary to continue to learn in a stable and enduring way in a rapidly changing world.

Kirschner and his research team propose a new “future-proof” basis for preparing students for success and fulfillment: 

  1. Cognitive and metacognitive skills are critical. Five of the identified GCM clusters emphasize such skills and suggest emphasizing a progression from concrete cognitive skills to more generic personality competencies.
  2. Authentic learning situations should be a high priority and the driving force for teaching, training, and learning. Such tasks help learners to integrate knowledge, skills, and attitudes, stimulate them to develop coordinate skills, and facilitate transfer of what is learned to new problem situations.
  3. Redesigning schools and professionalizing teachers in 21st century learning strategies are not likely to make much of a difference. Shift the focus to cognitive and metacognitive skills, linking learning with authentic, real-life situations and matching teaching methods with educational contexts and goals.

Rushing headlong into 21st century skills makes little sense to Kirschner and fellow researchers because the most effective and durable initiatives are those that are planned and staged over a longer span of as much as 15 years. He proposes a three-stage approach: 1) laying the building blocks (i.e., concrete cognitive knowledge and skills); 2) developing higher-order thinking and working skills; and 3) tackling bigger problems that require metacognitive competencies and skills. Much of the underlying research is neatly summarized in David Didau’s 2017 taxonomy demonstrating the connection between long-term memory and working memory in teaching and learning.

All of this is just a small taste of my upcoming researchED Vancouver 2019 presentation on the B.C. Education Plan.  It will not only analyze the B.C. version of 21st Century Learning, but attempt to point the province’s education system in the right direction.

Where did the “21st century skills” movement actually originate? Where’s the evidence-based research to support 21st century skills projects such as the B.C. Education Plan? How much of the Plan is driven by the imperatives of technology-based personalized learning and its purveyors? Can you successfully prepare students for careers and jobs that don’t exist? Would we be better advised to abandon “21st century skills” in favour of “future-proof learning”?

Read Full Post »

Millions of Facebook users were profiled by Cambridge Analytica without their knowledge, and that public disclosure has heightened everyone’s awareness of not only the trend toward “personality profiling,” but the potential for massive invasion of privacy. These controversial actions have exposed the scope of Big Data and the wider aspirations of the data analytics industry to probe into the “hidden depths of people.” It has also, as U.K. expert Ben Williamson has reminded us, tipped us off about the growing trend toward personality measurement in K-12 and post-secondary education.

Williamson’s 2017 book, Big Data in Education, sounded the alert that the collection and analysis of more personal information from schoolchildren will be a defining feature of education in coming years. And just as the Facebook debacle raises public concerns about the use of personal data, a new international test of 10- and 15-year-olds is to be introduced by the Organization for Economic Cooperation and Development (OECD) – a powerful influence on national education policies at a global scale. Almost without being detected, it is also emerging as a key component of the current Ontario Student “Well-Being” Assessment, initially piloted from 2014 to 2016 by Ontario’s People for Education as the core objective of its Measuring What Matters project.

Most data collected about students since the 1990s has come from conventional international, national and provincial examinations of knowledge and cognitive skills. Preparing students for success in the 21st century workplace has been a major driver of most initiatives in testing and accountability. International test results such as OECD’s Programme for International Student Assessment (PISA) have also become surrogate measures of the future economic potential of nations, feeding a global education race among national education systems.

The advent of Big Data is gradually transforming the nature of student assessment. While the initial phase was focused on stimulating competitive instincts and striving for excellence, more recent initiatives are seeking to “broaden the focus of student assessment” to include what is termed “social and emotional learning (SEL).” Much of the motivation is to secure some economic advantage, but that is now being more broadly defined to help mould students committed to more than individual competitiveness.  With the capacity to collect more “intimate” data about social and emotional skills to measure personality, education policymakers are devising curriculum and assessment programmes to improve personality scores. Despite the Cambridge Analytica controversy, personality data is well on the way to being used in education to achieve a variety of competing political objectives.

The ‘Big Five’ of Personality Profiling

The science of the psychographic profiling employed by Cambridge Analytica is hotly contested. It is, however, based on psychological methods that have a long history for measuring and categorizing people by personality. At its core is a psychological model called the “five factor model” of personality – or the “Big Five.” These include “openness”, “conscientiousness”, “extroversion”, “agreeableness” and “neuroticism” (OCEAN). Personality theorists believe these categories are suitable for classifying the full range of human personalities. Psychologists have invented instruments such as the so-called ‘Big Five Inventory’  to capture OCEAN data for personality modelling.

Advent of Stealth Assessment

The upcoming 2018 OECD PISA test will include, for the first time, a battery of questions aimed at assessing “global competencies” with a distinct SEL orientation. In 2019, the OECD plans to launch its international Study of Social and Emotional Learning. Designed as a computer-based self-completion questionnaire, the test is, at its core, a modified version of the Big Five Inventory. The OECD version maps exactly onto the five-factor personality categories, with “emotional stability” substituted for “neuroticism.” When implemented, the social and emotional skills test will assess students against each of the Big Five categories.

The OECD Education Skills experts, working in collaboration with Pearson International, firmly believe that social and emotional skills are important predictors of educational progress and future workplace performance. Large-scale personality data is clearly seen by the OECD to be predictive of a country’s potential social and economic progress. Although both the OECD and the Ontario Student Well-Being advocates claim that it is strictly a test of social and emotional skills, Williamson contends such projects employ the same family of methods used in the Cambridge Analytica personality quiz. Upon closer examination, the same psychological assumptions and personality assessment methods underpin most of the latest education ventures.

The OECD is already a powerful influence on the moulding of national education policies. Its PISA testing has reshaped school curricula, assessments and whole systems in the global education race.  It is increasingly likely that its emphasis on personality testing will, once again, reshape education policy and school practices. Just as PISA has influenced a global market in products to support the core skills of literacy, numeracy and science tested by the assessment, the same is now occurring around SEL and personality development.  Canada’s provincial and territorial ministers of education, working under the auspices of the Council of Ministers of Education, Canada (CMEC) have not only endorsed the OECD’s  proposed “global competencies,” but proposed a variation of their own to guide assessment policy.

The Ontario Student Assessment initiative, announced September 6, 2017, deserves closer scrutiny through the lens of datafication and personality profiling. Its overarching goal bears repeating: “Update provincial assessment and reporting practices, including EQAO, to make sure they are culturally relevant, measure a wider range of learning, and better reflect student well-being and equity.”  Founder of People for Education Annie Kidder hailed the plan for “embedding” the “transferable skills” and positioning Ontario to take “a leading role in the global movement toward broader goals for education and broader measures of success in our schools.”

Critics of large-scale student assessments are quick to identify the underlying influence of “globalization” and the oft-stated goal of preparing students for the highly competitive “21st century workplace.” It can be harder to spot currents moving in the opposite direction, heavily influenced by what Kathryn Ecclestone and Denis Hayes aptly termed the “therapeutic education ethos.” Ten years ago, they flagged the rise of a “therapeutic education” movement exemplified by classroom activities and programs, often branded as promoting ‘mindfulness,’ which pave the way for “coaching appropriate emotions” and transform education into a disguised form of “social engineering” aimed at producing “emotionally literate citizens” who are notably “happy” and experience “emotional well-being.”

Preparing students either to be highly competitive human beings or to be creative and cooperative individuals risks re-framing public education in terms of personality modification, driven by ideological motivations, rather than the pursuit of meaningful knowledge and understanding. It treats children as ‘guinea pigs’ in either market-competition preparation or social engineering, and may well stand in the way of classroom teachers pursuing their own evidence-based, knowledge-centred curriculum aims.

The appropriation and misuse of personality data by Facebook and Cambridge Analytica led to a significant worldwide public backlash. In education, however, tests and technologies to measure student personality, according to Williamson, are passing unchallenged. It is equally controversial to capture and mine students’ personality data with the goal of shaping students to “fit into” the evolving global marketplace. Stealth assessment has arrived, and being forewarned is forearmed.

Why is education embracing data mining and personality profiling for schoolchildren? What are the connections between Facebook data mining and recent social-and-emotional learning assessment initiatives? Should students and parents be advised, in advance, when student data is being mined and mapped against personality types? Why have Canadian assessment projects like the Ontario Measuring What Matters student well-being initiative escaped close scrutiny? Should we be more vigilant in tracking and monitoring the use and abuse of Big Data in education?

Read Full Post »

“Canadians can be proud of our showing in the 2015 Programme for International Student Assessment (PISA) report,” declared Science consultant Bonnie Schmidt and former Council of Ministers of Education (CMEC) director Andrew Parkin in their first-off-the-mark December 6, 2016 response to the results. “We are,” they added, “one of only a handful of countries that places in the top tier of the Organization for Economic Cooperation and Development (OECD) in each of the three subjects tested: science, reading and math.”

“Canada” and “Canadian students,” we were told, were once again riding high in the once-every-three-years international test sweepstakes. If that effusively positive response had a familiar ring, it was because it followed the official line advanced by a markedly similar CMEC media release, issued a few hours before the commentary.

Since our students, all students in each of our ten provincial school systems, were “excelling,” it was time for a little national back-slapping. There’s one problem with that blanket analysis: it serves to maintain the status quo, engender complacency, obscure the critical Mathematics scores, and disguise the lopsided nature of student performance from region to region.

Hold on, not so fast, CMEC — the devil is in the real details, more clearly portrayed in the OECD’s own “Country Profile” for Canada. Yes, 15-year-olds in three Canadian provinces (Alberta, British Columbia, and Quebec) achieved some excellent results, but overall Mathematics scores were down, and students in over half of our provinces trailed off into mediocrity in terms of performance. Our real success was not in performance, but rather in reducing the achievement gap adversely affecting disadvantaged students.

Over half a million 15-year-olds in some 72 jurisdictions all over the world completed PISA tests, and Schmidt and Parkin were not alone in making sweeping pronouncements about why some countries are up and others down in the global rankings.

Talking in aggregate terms about the PISA performance of 20,000 Canadian students in ten different provinces can be, and is, misleading, when the performance results in mathematics continue to lag, Ontario students continue to underperform, and students in two provinces, Manitoba and Saskatchewan, struggle in science, reading, and mathematics.  Explaining all that away is what breeds complacency in the school system.

My own PISA 2015 forecast was way off-base — and taught me a lesson. After the TIMSS 2015 Mathematics results released in November 2016, an East Asian sweep, led by Singapore and Korea, seemed like a safe bet. How Finland performs now attracts far less attention than it did in its halcyon days back in 2003 and 2006. The significant OECD pivot away from excellence toward equity caught me napping, and I completely missed the significance of moving (2012 to 2015) from pencil-and-paper to computer-based tests.

Some solace can be found in the erroneous forecasts of others. The recent Alberta Teachers’ Association (ATA) “Brace Yourself” memo, with its critique of standardized testing, seemed to forecast a calamitous drop in Alberta student performance levels. It only happened in Mathematics.

Advocates of the ‘Well-Being’ curriculum and broader assessment measures, championed by Toronto’s People for Education, will likely be temporarily thrown off-stride by the OECD’s new-found commitment to assessing equity in education. It will be harder now to paint PISA as evil and to discredit PISA results based upon such a narrow range of skills in reading, math and science.

The OECD’s “Country Profile” of Canada is worth studying carefully because it aggregates data from 2003 to 2015, clarifies the trends, and shows how Canadian students continue to struggle in mathematics far more than in reading and science.

Canadian students may have finished 12th in Mathematics with a 516 aggregate score, but the trend line continues to be in decline, down from 532 in 2003. Digging deeper, we see that students in only two provinces, Quebec (544) and BC (522), actually exceeded the national mean score. Canada’s former leader in Mathematics performance, Alberta, continued its downward spiral from the lofty heights of 549 (2003) to 511 (2015).

Since Ontario students’ provincial mathematics scores are declining, experts will be poring over the latest PISA results to see how bad it is in relation to the world’s top-performing systems. No surprises here: Ontario students scored 509, finishing 4th in Canada, and down from 530 on PISA 2003. Excellence will require a significant change in direction.

The biggest discovery in post-2015 PISA analysis was the positive link between explicit instruction and higher achievement in the 2015 core assessment subject, science. The most important factor linked with high performance remains socio-economic status (SES), but teacher-guided instruction ranked close behind, and students taught with minimal direction, in inquiry or project-based classes, simply performed less well on the global test.

The results of the 15-year-olds are largely determined over 10 years of schooling, and are not necessarily the direct consequence of the latest curriculum fad such as “discovery math.”

It’s better to look deeper into what this cohort of students was learning when they first entered the school system, in the mid-2000s. In the case of Canadian students, for example, student-centred learning was at its height, and the country was just awakening to the value of testing to determine what students were actually learning in class.

Where the student results are outstanding, such as Singapore and Estonia, it is not solely attributable to the excellence of teaching or the rigour of the math and science curriculum.

We know from the “tutoring explosion” in Canada’s major cities that the prevalence of private tuition classes after school is a contributing factor, and may explain the current advantage still enjoyed in mathematics by Pacific Rim students.

Children of Chinese heritage in Australia actually outperformed students in Shanghai on the 2012 PISA test, and we need to explore whether that may be true for their counterparts in Greater Vancouver. The so-called “Shanghai Effect” may be attributed as much to “tiger mothers” as it is to the quality of classroom instruction.

Whether Canada and Canadians continue to exhibit high PISA self-esteem or have simply plateaued does not matter as much as what we glean over the next few years from studying best international practice in teaching, learning, and assessment.

Surveying PISA student results, this much is clear: standing still is not an option in view of the profound changes that are taking place in life, work, and society.

 

Read Full Post »

With the release of the 2015 Program for International Student Assessment (PISA) on the horizon, the Organization for Economic Cooperation and Development (OECD) Education Office has stoked up the “Math Wars” with a new study. While the October 2016 report examines a number of key questions related to teaching Mathematics, OECD Education chose to highlight its findings on “memorization,” presumably to dispel perceptions about “classroom drill” and its use in various countries.

The OECD, which administers the PISA assessments every three years to 15-year-olds from around the globe, periodically publishes reports looking at slices of the data. Its October 2016 report, Ten Questions for Mathematics Teachers and How PISA Can Help Answer Them, based upon the most recent 2012 results, zeroes in on “memorization” and attempts to show that high-performing territories, like Shanghai-China, Korea, and Chinese-Taipei, rely less on memory work than lower-performing places like Ireland, the UK, and Australia.

American Mathematics educator Jo Boaler, renowned for “Creative Math,” jumped upon the PISA Study to buttress her case against “memorization” in elementary classrooms. In a highly contentious November 2016 Scientific American article, Boaler and co-author Pablo Zoido contended that PISA findings confirmed that “memorizers turned out to be the lowest achievers, and countries with high numbers of them—the U.S. was in the top third—also had the highest proportion of teens doing poorly on the PISA math assessment.” Students who relied on memorization, they further argued, were “approximately half a year behind students who used relational and self-monitoring strategies” such as those in Japan and France.

Australian education researcher Greg Ashman took a closer look at the PISA Study and called into question such hasty interpretations of the findings. Figure 1.2, How teachers teach and students learn, caught his eye, and he went to work interrogating the survey responses on “memorization” and the axes used to present the data. The PISA analysis, he discovered, also did not include an assessment of how teaching methods might be correlated with PISA scores in Mathematics. Manitoba Mathematics professor Robert Craigen spotted a giant hole in the PISA analysis, noting that the “memorization” data related to the “at-home strategies of students,” not their instructional experiences, and may well indicate that students who are improperly instructed in class resort to memorization on their own.

What would it look like, Ashman wondered, if the PISA report had plotted how students performed in relation to the preferred methods used on the continuum from “more student-oriented instruction” to “more teacher-directed instruction”? Breaking down all the data, he generated a new graph showing how teaching method correlated with math performance and found a “positive correlation” between teacher-directed instruction and higher Math scores. Correlations, he duly noted, do not necessarily imply causal relationships, but the graph clearly favoured a higher ratio of teacher-directed activity to student orientation.

Jumping on the latest research to seek justification for her own “meta-beliefs” is normal practice for Boaler and her “Discovery Math” education disciples. After junking, once again, the ‘strawmen’ of traditional Mathematics — “rote memorization” and “drill” — Boaler and Zoido wax philosophical and poetic: “If American classrooms begin to present the subject as one of open, visual, creative inquiry, accompanied by growth-mindset messages, more students will engage with math’s real beauty. PISA scores would rise, and, more important, our society could better tap the unlimited mathematical potential of our children.” That’s definitely stretching the evidence far beyond the breaking point.

The “Math Wars” do generate what University of Virginia psychologist Daniel T. Willingham has aptly described as “a fair amount of caricature.” The recent Boaler-Zoido Scientific American article is a prime example of that tendency. Most serious scholars of cognition tend to support the common-ground position that learning mathematics requires three distinct types of knowledge: factual, procedural and conceptual. “Factual knowledge,” Willingham points out, “includes having already in memory the answers to a small set of problems of addition, subtraction, multiplication, and division.” While some students can learn Mathematics through invented strategies, it cannot be relied upon for all children. On the other hand, knowledge of procedures is no guarantee of conceptual understanding, particularly when it comes to complexities such as dividing fractions. It’s clear to most sensible observers that knowing math facts, procedures and concepts is what counts when it comes to mastering mathematics.

Simply ignoring research that contradicts your ‘meta-beliefs’ is common on the Math Education battlefield. Recent academic research on “memorization” that contradicts Boaler and her entourage is simply ignored, even when it emanates from her own university. Two years ago, Shaozheng Qin and Vinod Menon of Stanford University Medical School led a team that provided scientifically validated evidence that “rote memorization” plays a critical role in building the capacity to solve complex calculations.

Based upon a clinical study of 68 children, aged 7 to 9, followed over the course of one year, their 2014 Nature Neuroscience study found that memorizing the answers to simple math problems, such as basic addition or multiplication, forms a key step in a child’s cognitive development, helping bridge the gap between counting on fingers and tackling more complex calculations. Memorizing the basics, they concluded, is the gateway to activating the hippocampus, a key brain structure for memory, which gradually expands in “overlapping waves” to accommodate the greater demands of more complex math.

The whole debate over memorization is suspect because of the imprecision in the use of the term. Practice, drilling, and memorization are not the same, even though they get conflated in Jo Boaler’s work and in much of the current Mathematics Education literature. Back in July 2012, D.T. Willingham made this crucial point and provided some valuable points of distinction. “Practice,” as defined by Anders Ericsson, involves performing tasks and receiving feedback on that performance, executed for the purpose of improvement. “Drilling” connotes repetition for the purpose of achieving automaticity, which, at its worst, amounts to mindless repetition or parroting. “Memorization,” on the other hand, relates to the goal of something ending up in long-term memory with ready access, but does not imply using any particular method to achieve that goal.

Memorization has become a dirty word in teaching and learning, laden with so much baggage that it conjures up mental pictures of “drill and kill” in the classroom. The 2016 PISA Study appears to perpetuate such stereotyping and, worst of all, completely misses the “positive correlation” between teacher-directed or explicit instruction and better performance in mathematics.

Why does the PISA Study tend to associate memorization in home-study settings with the drudgery of drill in the classroom?  To what extent does the PISA Study on Mathematics Teaching support the claims made by Jo Boaler and her ‘Discovery Math’ advocates? When it comes to assessing the most effective teaching methods, why did the PISA researchers essentially take a pass? 

 

Read Full Post »