Millions of Facebook users were profiled by Cambridge Analytica without their knowledge, and that public disclosure has heightened everyone’s awareness not only of the trend toward “personality profiling,” but of the potential for massive invasion of privacy. These controversial actions have exposed the scope of Big Data and the wider aspirations of the data analytics industry to probe into the “hidden depths of people.” The episode has also, as U.K. expert Ben Williamson has reminded us, tipped us off about the growing trend toward personality measurement in K-12 and post-secondary education.

Williamson’s 2017 book, Big Data in Education, sounded the alert that the collection and analysis of more personal information from schoolchildren will be a defining feature of education in coming years. And just as the Facebook debacle raises public concerns about the use of personal data, a new international test of 10- and 15-year-olds is to be introduced by the Organization for Economic Cooperation and Development (OECD) – a powerful influence on national education policies on a global scale. Almost undetected, it is also emerging as a key component of the current Ontario Student “Well-Being” Assessment, initially piloted from 2014 to 2016 by Ontario People for Education as the core objective of its Measuring What Matters project.

Most data collected about students since the 1990s has come from conventional international, national and provincial examinations of knowledge and cognitive skills. Preparing students for success in the 21st century workplace has been a major driver of most initiatives in testing and accountability. International test results such as the OECD’s Programme for International Student Assessment (PISA) have also become surrogate measures of the future economic potential of nations, feeding a global education race among national education systems.

The advent of Big Data is gradually transforming the nature of student assessment. While the initial phase was focused on stimulating competitive instincts and striving for excellence, more recent initiatives are seeking to “broaden the focus of student assessment” to include what is termed “social and emotional learning (SEL).” Much of the motivation is to secure some economic advantage, but that is now being more broadly defined to help mould students committed to more than individual competitiveness. With the capacity to collect more “intimate” data about social and emotional skills to measure personality, education policymakers are devising curriculum and assessment programmes to improve personality scores. Despite the Cambridge Analytica controversy, personality data is well on the way to being used in education to achieve a variety of competing political objectives.

The ‘Big Five’ of Personality Profiling

The science of the psychographic profiling employed by Cambridge Analytica is hotly contested. It is, however, based on psychological methods that have a long history for measuring and categorizing people by personality. At its core is a psychological model called the “five factor model” of personality – or the “Big Five.” These include “openness”, “conscientiousness”, “extroversion”, “agreeableness” and “neuroticism” (OCEAN). Personality theorists believe these categories are suitable for classifying the full range of human personalities. Psychologists have invented instruments such as the so-called ‘Big Five Inventory’  to capture OCEAN data for personality modelling.
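As a purely illustrative sketch, scoring an OCEAN-style instrument amounts to averaging Likert responses into the five trait categories, with some items reverse-keyed. The item texts, trait keys, and 1–5 scale below are hypothetical stand-ins, not the actual Big Five Inventory items:

```python
# Hypothetical Big Five-style scoring sketch. The items and keys below
# are invented for illustration; they are NOT the real inventory.

TRAITS = ["openness", "conscientiousness", "extroversion",
          "agreeableness", "neuroticism"]  # the OCEAN categories

# (item text, trait it measures, whether it is reverse-keyed)
ITEMS = [
    ("Is curious about many different things", "openness", False),
    ("Does a thorough job", "conscientiousness", False),
    ("Is reserved", "extroversion", True),
    ("Is helpful and unselfish with others", "agreeableness", False),
    ("Worries a lot", "neuroticism", False),
]

def score_ocean(responses):
    """Average 1-5 Likert responses into one score per OCEAN trait.

    Reverse-keyed items are flipped (6 - response) before averaging.
    """
    totals = {t: [] for t in TRAITS}
    for (_text, trait, reverse), r in zip(ITEMS, responses):
        totals[trait].append(6 - r if reverse else r)
    return {t: sum(v) / len(v) for t, v in totals.items() if v}

print(score_ocean([5, 4, 2, 4, 1]))
```

Real instruments use many items per trait and validated norms; the point here is only that a handful of questionnaire answers is mechanically reducible to a five-number personality profile, which is what makes such data so easy to collect and mine at scale.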

Advent of Stealth Assessment

The upcoming 2018 OECD PISA test will include, for the first time, a battery of questions aimed at assessing “global competencies” with a distinct SEL orientation. In 2019, the OECD plans to launch its international Study of Social and Emotional Learning. Designed as a computer-based self-completion questionnaire, the test is, at its core, a modified version of the Big Five Inventory. The OECD version maps exactly onto the five factor personality categories, with “emotional stability” substituted in place of “neuroticism.” When implemented, the social and emotional skills test will assess students against each of the Big Five categories.

The OECD Education Skills experts, working in collaboration with Pearson International, firmly believe that social and emotional skills are important predictors of educational progress and future workplace performance. Large-scale personality data is clearly seen by the OECD as predictive of a country’s potential social and economic progress. Although both the OECD and the Ontario Student Well-Being advocates claim that theirs is strictly a test of social and emotional skills, Williamson claims such projects employ the same family of methods used in the Cambridge Analytica personality quiz. Upon closer examination, the same psychological assumptions and personality assessment methods underpin most of the latest education ventures.

The OECD is already a powerful influence on the moulding of national education policies. Its PISA testing has reshaped school curricula, assessments and whole systems in the global education race. It is increasingly likely that its emphasis on personality testing will, once again, reshape education policy and school practices. Just as PISA has influenced a global market in products to support the core skills of literacy, numeracy and science tested by the assessment, the same is now occurring around SEL and personality development. Canada’s provincial and territorial ministers of education, working under the auspices of the Council of Ministers of Education, Canada (CMEC), have not only endorsed the OECD’s proposed “global competencies,” but proposed a variation of their own to guide assessment policy.

The Ontario Student Assessment initiative, announced September 6, 2017, deserves closer scrutiny through the lens of datafication and personality profiling. Its overarching goal bears repeating: “Update provincial assessment and reporting practices, including EQAO, to make sure they are culturally relevant, measure a wider range of learning, and better reflect student well-being and equity.” Founder of People for Education Annie Kidder hailed the plan for “embedding” the “transferable skills” and positioning Ontario to take “a leading role in the global movement toward broader goals for education and broader measures of success in our schools.”

Critics of large-scale student assessments are quick to identify the underlying influence of “globalization” and the oft-stated goal of preparing students for the highly competitive “21st century workplace.” It can be harder to spot currents moving in the opposite direction, heavily influenced by what Kathryn Ecclestone and Denis Hayes aptly termed the “therapeutic education ethos.” Ten years ago, they flagged the rise of a “therapeutic education” movement, exemplified by classroom activities and programs often branded as promoting ‘mindfulness,’ that paves the way for “coaching appropriate emotions” and transforms education into a disguised form of “social engineering” aimed at producing “emotionally literate citizens” who are notably “happy” and experience “emotional well-being.”

Preparing students either to be highly competitive human beings or to be creative and cooperative individuals risks re-framing public education in terms of personality modification, driven by ideological motivations, rather than the pursuit of meaningful knowledge and understanding. It treats children as ‘guinea pigs’ engaged in either market-competition preparation or social engineering, and may well stand in the way of classroom teachers pursuing their own evidence-based, knowledge-centred curriculum aims.

The appropriation and misuse of personality data by Facebook and Cambridge Analytica led to a significant world-wide public backlash. In education, however, tests and technologies to measure student personality are, according to Williamson, passing unchallenged. Yet it is equally controversial to capture and mine students’ personality data with the goal of shaping students to “fit into” the evolving global marketplace. Stealth assessment has arrived, and being forewarned is forearmed.

Why is education embracing data mining and personality profiling for schoolchildren? What are the connections between Facebook data mining and recent social-and-emotional learning assessment initiatives? Should students and parents be advised, in advance, when student data is being mined and mapped against personality types? Why have Canadian assessment projects like the Ontario Measuring What Matters – Student Well-Being initiative escaped close scrutiny? Should we be more vigilant in tracking and monitoring the use and abuse of Big Data in education?


“Canadians can be proud of our showing in the 2015 Programme for International Student Assessment (PISA) report,” declared science consultant Bonnie Schmidt and former Council of Ministers of Education (CMEC) director Andrew Parkin in their first-off-the-mark December 6, 2016 response to the results. “We are,” they added, “one of only a handful of countries that places in the top tier of the Organization for Economic Cooperation and Development (OECD) in each of the three subjects tested: science, reading and math.”

“Canada” and “Canadian students,” we were told, were once again riding high in the once-every-three-years international test sweepstakes. If that effusively positive response had a familiar ring, it was because it followed the official line advanced by a markedly similar CMEC media release, issued a few hours before the commentary.

Since our students, all students in each of our ten provincial school systems, were “excelling,” it was time for a little national back-slapping. There’s one problem with that blanket analysis: it serves to maintain the status quo, engender complacency, obscure the critical Mathematics scores, and disguise the lopsided nature of student performance from region to region.

Hold on, not so fast, CMEC — the devil is in the real details, more clearly portrayed in the OECD’s own “Country Profile” for Canada. Yes, 15-year-olds in three Canadian provinces (Alberta, British Columbia, and Quebec) achieved some excellent results, but overall Mathematics scores were down, and students in over half of our provinces trailed off into mediocrity in terms of performance. Our real success was not in performance, but rather in reducing the achievement gap adversely affecting disadvantaged students.

Over half a million 15-year-olds in 72 jurisdictions all over the world completed PISA tests, and Schmidt and Parkin were not alone in making sweeping pronouncements about why some countries are up and others down in the global rankings.

Talking in aggregate terms about the PISA performance of 20,000 Canadian students in ten different provinces can be, and is, misleading when the performance results in mathematics continue to lag, Ontario students continue to underperform, and students in two provinces, Manitoba and Saskatchewan, struggle in science, reading, and mathematics. Explaining all that away is what breeds complacency in the school system.

My own PISA 2015 forecast was way off-base — and taught me a lesson. After the TIMSS 2015 Mathematics results released in November 2016, an East Asian sweep, led by Singapore and Korea, seemed like a safe bet. How Finland performs also attracts far less attention than it did in its halcyon days back in 2003 and 2006. The significant OECD pivot away from excellence toward equity caught me napping, and I completely missed the significance of moving (2012 to 2015) from pencil-and-paper to computer-based tests.

Some solace can be found in the erroneous forecasts of others. The recent Alberta Teachers’ Association (ATA) “Brace Yourself” memo, with its critique of standardized testing, seemed to forecast a calamitous drop in Alberta student performance levels. It only happened in Mathematics.

Advocates of the ‘Well-Being’ curriculum and broader assessment measures, championed by Toronto’s People for Education, will likely be temporarily thrown off-stride by the OECD’s new-found commitment to assessing equity in education. It will be harder now to paint PISA as evil and to discredit PISA results as being based upon such a narrow range of skills in reading, math and science.

The OECD’s “Country Profile” of Canada is worth studying carefully because it aggregates data from 2003 to 2015, clarifies the trends, and shows how Canadian students continue to struggle in mathematics far more than in reading and science.

Canadian students may have finished 12th in Mathematics with a 516 aggregate score, but the trend line continues to be in decline, down from 532 in 2003. Digging deeper, we see that students in only two provinces, Quebec (544) and BC (522), actually exceeded the national mean score. Canada’s former leader in Mathematics performance, Alberta, continued its downward spiral from the lofty heights of 549 (2003) to 511 (2015).

Since Ontario students’ provincial mathematics scores are declining, experts will be poring over the latest PISA results to see how bad it is in relation to the world’s top performing systems. No surprises here: Ontario students scored 509, finishing 4th in Canada, and down from 530 on PISA 2003. Excellence will require a significant change in direction.

The biggest discovery in post-2015 PISA analysis was the positive link between explicit instruction and higher achievement in the 2015 core assessment subject, science. The most important factor linked with high performance remains SES (socio-economic status), but teacher-guided instruction weighed in close behind, and students taught with minimal direction, in inquiry or project-based classes, simply performed less well on the global test.

The results of the 15-year-olds are largely determined over 10 years of schooling, and are not necessarily the direct consequence of the latest curriculum fad such as “discovery math.”

It’s better to look deeper into what this cohort of students were learning when they first entered the school system, in the mid-1990s. In the case of Canadian students, for example, student-centred learning was at its height, and the country was just awakening to the value of testing to determine what students were actually learning in class.

Where the student results are outstanding, such as Singapore and Estonia, it is not solely attributable to the excellence of teaching or the rigour of the math and science curriculum.

We know from the “tutoring explosion” in Canada’s major cities that the prevalence of private tuition classes after school is a contributing factor, and may explain the current advantage still enjoyed in mathematics by Pacific Rim students.

Children of Chinese heritage in Australia actually outperformed students in Shanghai on the 2012 PISA test, and we need to explore whether that may be true for their counterparts in Greater Vancouver. The so-called “Shanghai Effect” may be attributed as much to “tiger mothers” as it is to the quality of classroom instruction.

Whether Canada and Canadians continue to exhibit high PISA self-esteem or have simply plateaued does not matter as much as what we glean over the next few years from studying best international practice in teaching, learning, and assessment.

Surveying PISA student results, this much is clear: standing still is not an option in view of the profound changes that are taking place in life, work, and society.

 


With the release of the 2015 Programme for International Student Assessment (PISA) results on the horizon, the Organization for Economic Cooperation and Development (OECD) Education Office stoked up the “Math Wars” with a new study. While the October 2016 report examines a number of key questions related to teaching Mathematics, OECD Education chose to highlight its findings on “memorization,” presumably to dispel perceptions about “classroom drill” and its use in various countries.

The OECD, which administers the PISA assessments every three years to 15-year-olds from around the globe, periodically publishes reports looking at slices of the data. Its October 2016 report, Ten Questions for Mathematics Teachers and How PISA Can Help Answer Them, based upon the most recent 2012 results, tends to zero in on “memorization” and attempts to show that high-performing territories, like Shanghai-China, Korea, and Chinese-Taipei, rely less on memory work than lower-performing places like Ireland, the UK, and Australia.

American Mathematics educator Jo Boaler, renowned for “Creative Math,” jumped upon the PISA study to buttress her case against “memorization” in elementary classrooms. In a highly contentious November 2016 Scientific American article, Boaler and co-author Pablo Zoido contended that PISA findings confirmed that “memorizers turned out to be the lowest achievers, and countries with high numbers of them—the U.S. was in the top third—also had the highest proportion of teens doing poorly on the PISA math assessment.” Students who relied on memorization, they further argued, were “approximately half a year behind students who used relational and self-monitoring strategies” such as those in Japan and France.

Australian education researcher Greg Ashman took a closer look at the PISA study and called into question such hasty interpretations of the findings. Figure 1.2 (“How teachers teach and students learn”) caught his eye, and he went to work interrogating the survey responses on “memorization” and the axes used to present the data. The PISA analysis, he discovered, also did not include an assessment of how teaching methods might be correlated with PISA scores in Mathematics. Manitoba Mathematics professor Robert Craigen spotted a giant hole in the PISA analysis and noted that the “memorization” data related to the “at-home strategies of students,” not their instructional experiences, and may well indicate that students who are improperly instructed in class resort to memorization on their own.

What would it look like, Ashman wondered, if the PISA report had plotted how students performed in relation to the preferred methods used, on a continuum from “more student-oriented instruction” to “more teacher-directed instruction”? Breaking down all the data, he generated a new graph that actually showed how teaching method correlated with math performance, revealing a “positive correlation” between teacher-directed instruction and higher Math scores. Correlations, he duly noted, do not necessarily imply causal relationships, but the association between a higher ratio of teacher-directed activity to student orientation and stronger results was clear.
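A rough sketch of the kind of re-analysis Ashman performed: compute the correlation between a per-country index of teacher-directed instruction and that country’s mean math score. The country values below are invented for illustration only, not actual PISA 2012 data:

```python
# Illustrative only: the instruction-style index values and mean math
# scores below are made-up numbers, NOT real PISA 2012 country data.

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Hypothetical per-country values: ratio of teacher-directed to
# student-oriented instruction vs. mean PISA math score.
teacher_directed_index = [0.9, 1.1, 1.3, 1.5, 1.8]
mean_math_score = [480, 495, 505, 520, 540]

r = pearson_r(teacher_directed_index, mean_math_score)
print(f"r = {r:.2f}")
```

A positive r on the real country-level data is what Ashman reported; as he cautioned, such a correlation describes an association across countries, not a causal effect of any teaching method.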

Jumping on the latest research to seek justification for her own “meta-beliefs” is normal practice for Boaler and her “Discovery Math” education disciples. After junking, once again, the ‘strawmen’ of traditional Mathematics (“rote memorization” and “drill”), Boaler and Zoido wax philosophical and poetic: “If American classrooms begin to present the subject as one of open, visual, creative inquiry, accompanied by growth-mindset messages, more students will engage with math’s real beauty. PISA scores would rise, and, more important, our society could better tap the unlimited mathematical potential of our children.” That’s definitely stretching the evidence far beyond the breaking point.

The “Math Wars” do generate what University of Virginia psychologist Daniel T. Willingham has aptly described as “a fair amount of caricature.” The recent Boaler-Zoido Scientific American article is a prime example of that tendency. Most serious scholars of cognition tend to support the common-ground position that learning mathematics requires three distinct types of knowledge: factual, procedural and conceptual. “Factual knowledge,” Willingham points out, “includes having already in memory the answers to a small set of problems of addition, subtraction, multiplication, and division.” While some students can learn Mathematics through invented strategies, that route cannot be relied upon for all children. On the other hand, knowledge of procedures is no guarantee of conceptual understanding, particularly when it comes to complexities such as dividing fractions. It’s clear to most sensible observers that knowing math facts, procedures and concepts is what counts when it comes to mastering mathematics.

Simply ignoring research that contradicts your ‘meta-beliefs’ is common on the Math Education battlefield. Recent academic research on “memorization” that contradicts Boaler and her entourage is simply ignored, even research emanating from her own university. Two years ago, Shaozheng Qin and Vinod Menon of Stanford University Medical School led a team that provided scientifically-validated evidence that “rote memorization” plays a critical role in building capacity to solve complex calculations.

In their 2014 Nature Neuroscience study, based upon a clinical investigation of 68 children, aged 7 to 9, followed over the course of one year, Qin, Menon and colleagues found that memorizing the answers to simple math problems, such as basic addition or multiplication, forms a key step in a child’s cognitive development, helping bridge the gap between counting on fingers and tackling more complex calculations. Memorizing the basics, they concluded, is the gateway to activating the hippocampus, a key brain structure for memory, which gradually expands in “overlapping waves” to accommodate the greater demands of more complex math.

The whole debate over memorization is suspect because of the imprecision in the use of the term. Practice, drilling, and memorization are not the same, even though they get conflated in Jo Boaler’s work and in much of the current Mathematics Education literature. Back in July 2012, D.T. Willingham made this crucial point and provided some valuable points of distinction. “Practice,” as defined by Anders Ericsson, involves performing tasks and receiving feedback on that performance, executed for the purpose of improvement. “Drilling” connotes repetition for the purpose of achieving automaticity, which, at its worst, amounts to mindless repetition or parroting. “Memorization,” on the other hand, relates to the goal of something ending up in long-term memory with ready access, but does not imply using any particular method to achieve that goal.

Memorization has become a dirty word in teaching and learning, laden with so much baggage that it conjures up mental pictures of “drill and kill” in the classroom. The 2016 PISA study appears to perpetuate such stereotyping and, worst of all, completely misses the “positive correlation” between teacher-directed or explicit instruction and better performance in mathematics.

Why does the PISA study tend to associate memorization in home-study settings with the drudgery of drill in the classroom? To what extent does the PISA study on Mathematics teaching support the claims made by Jo Boaler and her ‘Discovery Math’ advocates? When it comes to assessing the most effective teaching methods, why did the PISA researchers essentially take a pass?

 
