
Archive for the ‘Mathematics Education’ Category

Surveying the public as well as ‘stakeholders’ for their opinions is the latest trend in Canadian K-12 education policy. Two recent surveys, conducted in Nova Scotia and Alberta, provide examples worthy of further discussion. The release of Alberta Education Minister David Eggen’s curriculum survey results (April 13, 2017) also demonstrates that unsuspecting citizens may need help penetrating the official spin to get at the actual results.

Facing deep divisions in P-12 education over future directions, and not inclined to follow the research evidence, provincial authorities are turning, more and more, to soliciting public opinion through surveys with pre-determined outcomes. Upon close scrutiny, the Alberta survey seems wholly designed to confirm intended curriculum directions.

Conducting public surveys is not without its risks. In the case of the 2014 Nova Scotia Education Review survey, a largely unvarnished, no-holds-barred instrument actually backfired on the Education Department. When the N.S. Review Committee headed by Myra Freeman polled 18,500 residents, the results published in October 2014 proved a real jolt and sent the provincial teachers’ union into a tizzy, focused mostly on its exclusion from shaping the survey and from serving on the commission.

One half of Nova Scotians, the survey found, were “not satisfied with the public school system,” and teachers as well as parents identified plenty of reasons why. The report, Disrupting the Status Quo, generated very high expectations — never honoured — that major reform was on the way. A three-month NSTU teacher work-to-rule in 2016-17 effectively sank the quality education reform plan and generated a completely new set of teacher-driven demands for improvement in “working conditions.”

Alberta Education had no desire to see that pattern repeated. Minister Eggen’s curriculum survey looked, and sounded, skewed in the Education Department’s preferred direction – toward more of what is loosely termed “21st century learning.” In Alberta Education futuristic doubletalk, the overarching goal is to produce students who “are agents of change to create the globe that they want to be part of.”

The survey, conducted in October and November 2016, attracted some 32,390 respondents, of whom only slightly over half (57%) might be classed as ‘outside the system.’ The proposed directions were presented as amorphous curriculum “themes,” and respondents were clearly led to certain conclusions. You are, for example, asked whether you agree or disagree with this statement: “Through learning outcomes curriculum should support the development of literacy, numeracy and 21st century competencies.” It is impossible to answer if you think basic numeracy and literacy should take precedence over the ill-defined futuristic skills.

Conducting the survey was further confirmation of the provincial strategy to thwart mathematics education reform. With the Alberta “Back to Basics” petition, initiated by parent Dr. Nhung Tran-Davies of Calmar, AB, piling up 18,332 signatures, the survey attempts, in clumsy fashion, to override that hardened opinion.

The Department’s summary of responses does its best to conceal the extent of resistance to current K-12 Mathematics teaching and curricula. Sifting through the Mathematics responses reveals that support for teaching math facts, restoring step-by-step algorithmic thinking, limiting the use of computers, and mastering mental math far outweighed any preference for “21st century competencies” or their step-child, discovery math.

Instead of addressing these findings, Minister Eggen ‘cherry-picked’ one example of the desire for ‘relevance’ – support for including financial literacy in Grade 4 to 9 classes. That, too, is a clear sign that parents want their kids to be able to balance a set of sums.

Albertans’ written responses to the open-ended questions are the clearest indication of their true inclinations. Out of the 15,724 respondents committed enough to do more than tick boxes, the largest segment (10%) favoured refocusing on “math basics” and singled out “discovery math” as a problem. Combined with “learning the basics” (6%) and teaching practical skills (7%), one in four who made comments targeted the lack of rigour in the curriculum.
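The arithmetic behind that “one in four” claim can be checked directly. A minimal sketch (the category shares and the 15,724 respondent count are from the survey summary discussed above; the category labels are paraphrased):

```python
# Shares of the 15,724 open-ended respondents, as reported above.
shares = {
    "refocus on math basics": 0.10,
    "learning the basics": 0.06,
    "teaching practical skills": 0.07,
}

combined = sum(shares.values())
respondents = round(combined * 15724)

print(f"combined share: {combined:.0%}")   # 23% -- roughly one in four
print(f"approximate respondents: {respondents}")
```

A 23% combined share rounds loosely to “one in four,” which is how the commentary characterizes it.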

Judging from the wording of questions, the entire survey also skewed in the direction of student-centred teaching methods. That’s strange because the recent PISA 2015 global results report demonstrated conclusively that “explicit instruction” produced much better student results than “minimally-guided instruction.”

The inherent bias pops up elsewhere. “This survey,” it reported, “was intended to focus on the ‘what’ of current provincial curriculum, not ‘how’ teachers teach it.” Serious curriculum analysts know it’s now virtually impossible to separate the two in assessing program effectiveness.

Provincial education authorities were, at one time, given explicit mandates based upon either firm political policy positions or best practice research. When governments are lost and searching for direction, they may turn to the public to find their bearings. In the case of Alberta, it looks more like surveying for confirmation of the ‘educrats’ own pre-determined direction.

*A condensed version of this Commentary appeared in the Edmonton Journal, April 18, 2017.

Why do school systems survey the public? Are Canadian provincial governments totally lost on K-12 education and simply looking for direction? Do our Education Departments harbour a secret agenda? Or are they looking for public confirmation of pre-conceived plans for curriculum changes?


Developing a Growth Mindset in students and their teachers is perhaps the hottest trend in the education world outside of Canada. Originating in psychological science research conducted by Carol S. Dweck, starting in the late 1980s and continuing at Stanford University, it burst upon the education scene in 2006 with the publication of Dweck’s influential book, Mindset: The New Psychology of Success. Growth mindset, the next great thing, became an instant buzzword in many education faculties and professional development sessions.

The so-called Mindset Revolution, like most education fads, has also generated its share of imitations and mutations. Two of the best known are the Mathematical Mindset, promulgated by Mathematics educator Jo Boaler, and a more recent Canadian spin-off, The Innovator’s Mindset, the brain-child of George Couros, a division principal of Teaching and Learning with Parkland School District, in Stony Plain, Alberta, Canada. While Growth Mindset 1.0 got little traction in Canada, the second-generation iteration dreamed up by Couros is increasingly popular among technology-savvy Canadian and American educators.

Legions of professional educators and teachers in the United States, Britain, and Australia have latched onto GM theory and practice with a vengeance. One reliable barometer of ‘trendiness,’ the George Lucas Educational Foundation website, Edutopia, provides a steady stream of short vignettes and on-line videos extolling the virtues of GM in the classroom. The growing list of Growth Mindset pieces @Edutopia purports to “support students in believing that they can develop their talents and abilities through hard work, good strategies, and help from others.”

What is the original conception of the Growth Mindset?  Here is how Carol Dweck explained it succinctly in the September 22, 2015 issue of Education Week: “We found that students’ mindsets—how they perceive their abilities—played a key role in their motivation and achievement, and we found that if we changed students’ mindsets, we could boost their achievement. More precisely, students who believed their intelligence could be developed (a growth mindset) outperformed those who believed their intelligence was fixed (a fixed mindset). And when students learned through a structured program that they could “grow their brains” and increase their intellectual abilities, they did better. Finally, we found that having children focus on the process that leads to learning (like hard work or trying new strategies) could foster a growth mindset and its benefits.”

Dweck’s theory of Growth Mindsets gained credibility because, unlike most educational ‘fads,’ it did emerge out of some sound initial research into brain plasticity and was tested in case studies with students in the schools. Leading education researcher Dylan Wiliam, a renowned student assessment expert, lent his support to the Growth Mindset movement when he embraced Dweck’s findings and applied them to building ‘feedback’ into student assessment. He adopted this equation: Talent = Hard Work + Persistence (A Growth Mindset) and offered this endorsement: “The harder you work, the smarter you get. Once students begin to understand this “growth mindset” as Carol Dweck calls it, students are much more likely to embrace feedback from their teachers.”

Ten years on, cracks appeared in the Growth Mindset movement when some of the liveliest minds in education research began to probe more deeply into the theory, the follow-up studies, and the supposed evidence of student success. An early skeptic, Disappointed Idealist, hit a nerve with a brave little commentary on December 5, 2014, wondering whether the Growth Mindset described a world as we wanted it to be, rather than one as it is, and likening it to “telling penguins to flap harder (and they would be able to fly like other birds).” Self-styled ‘education progressives’ have taken their cue from American writer Alfie Kohn, who weighed in with a widely-read Salon commentary in which he argued that Dweck’s research had been appropriated by “conservative” educators trying to “fix our kids” when we should be “fixing the system.”

The Growth Mindset ‘magic dust’ is wearing thin in the United Kingdom. British education gadfly David Didau, The Learning Spy, initially “pretty psyched” by Dweck’s theory, has grown increasingly skeptical over the past year or so. In a succession of pointed commentaries, he has punched holes in the assumption that all students possess unlimited “growth potential,” examined why more recent GM interventions have not replicated Dweck’s initial results, questioned whether GM is founded on pseudoscience, and even suggested that the whole theory might be “bollocks.”

Intrepid Belgian education researcher Pedro De Bruyckere, co-author of Urban Myths About Learning and Education, has registered his concerns about the validity of the research support, citing University of Edinburgh psychologist Timothy Bates’ findings. Based upon case studies with 12-year-olds in China, Bates found no evidence of the dramatic changes reported in Dweck’s earlier studies: “People with a growth mindset don’t cope any better with failure. If we give them the mindset intervention, it doesn’t make them behave better. Kids with the growth mindset aren’t getting better grades, either before or after our intervention study.”

For much of the past two years, Dweck and her research associate Susan Mackie have been alerting researchers and education policy-makers to the spread of what is termed a “false growth mindset” in schools and classrooms in Australia as well as Britain and the United States. Too many teachers and parents, they point out, have either misinterpreted or debased the whole concept, reducing it to simple axioms like “Praise the effort, not the child (or the outcome).” In most cases, it’s educational progressives, or parents, looking for alternatives to “drilling with standardized tests.”

Dweck’s greatest fear nowadays is that Growth Mindset has been appropriated by education professionals to reinforce existing student-centred practices and to suit their own purposes. That serious concern is worth repeating: “It’s the fear that the mindset concepts, which grew up to counter the failed self-esteem movement, will be used to perpetuate that movement.” In a December 2016 interview story in The Atlantic, she conceded that it was being used in precisely that way, in too many classrooms, and it amounted to “blanketing everyone with praise, whether deserved or not.”

A “false growth mindset” arises, according to Dweck, when educators use the term too liberally and simply do not really understand that it’s intended to motivate students to work harder and demonstrate more resilience in overcoming setbacks. She puts it this way:  “The growth mindset was intended to help close achievement gaps, not hide them. It is about telling the truth about a student’s current achievement and then, together, doing something about it, helping him or her become smarter.” Far too many growth mindset disciples, Dweck now recognizes, reverted to praising students rather than taking “the long and difficult journey” in the learning process and showing “how hard work, good strategies, and good use of resources lead to better learning.”

One of Dweck’s most prominent champions, Jo Boaler, may be contributing to the misappropriation of Growth Mindset theory in her field. An influential Stanford University mathematics education professor, Boaler is best known as an apostle of constructivist approaches to teaching Mathematics in schools. She saw in Dweck’s Growth Mindset theory confirmation that a “fixed mindset” was harmful to kids convinced that they “can’t do Math.” It all fit nicely into her own conception of how children learn Math best – by exploration and discovery in classrooms unleashing children’s potential. It became, for Boaler, a means of addressing “inequalities” perpetuated by “ability groupings” in schools. It also served to advance her efforts to “significantly reposition mistakes in mathematics” and replace “crosses” with “gold stars” and whole-class “opportunities for learning.”

The Canadian mutation, George Couros’ The Innovator’s Mindset, seeks to extend Carol Dweck’s original theory into the realm of technology and creativity. Troubled by the limitations of Dweck’s model and its emphasis on mastery of knowledge and skills, he made an “awesome” (his word) discovery – that GM could be a powerful leadership tool for advancing “continuous creation.” In his mutation of the theory, the binary “fixed” vs. “growth” model morphs into a more advanced stage, termed the “innovator’s mindset.” In his fertile and creative mind, it is transmogrified into a completely new theory of teaching and learning.

Taking poetic licence with Dweck’s research-based thesis, Couros spins a completely different interpretation in his fascinating professional blog, The Principal of Change:

“As we look at how we see and “do” school, it is important to continuously shift to moving from consumption to creation, engagement to empowerment, and observation to application. It is not that the first replaces the latter, but that we are not settling for the former. A mindset that is simply open to “growth” will not be enough in a world that is asking for continuous creation of not only products, but ideas.”

Promising educational theories, even those founded on some robust initial research, can fall prey to prominent educators pushing their own ‘pet ideas’ and pedagogical theories. While a 2016 Education Week report demonstrates that GM initiatives produce mixed results and British education researchers are having a field day picking apart Carol Dweck’s research findings, another version of her creation is emerging, making it even harder to assess how her serious case studies are being replicated around the world.

Which version of Carol Dweck’s Growth Mindset theory and practice are we assessing – the original conception or the “false” conception?  How and why did an educational theory intended to motivate students, instill a work ethic, and help kids overcome obstacles get so debased in translation into classroom practice?  Is the fate of the Growth Mindset indicative of something more troubling in the world of education research? 



With the release of the 2015 Program for International Student Assessment (PISA) on the horizon, the Organization for Economic Cooperation and Development (OECD) Education Office has stoked up the “Math Wars” with a new study. While the October 2016 report examines a number of key questions related to teaching Mathematics, OECD Education chose to highlight its findings on “memorization,” presumably to dispel perceptions about “classroom drill” and its use in various countries.

The OECD, which administers the PISA assessments every three years to 15-year-olds from around the globe, periodically publishes reports looking at slices of the data. Its October 2016 report, Ten Questions for Mathematics Teachers and How PISA Can Help Answer Them, based upon the most recent 2012 results, tends to zero in on “memorization” and attempts to show that high-performing territories, like Shanghai-China, Korea, and Chinese-Taipei, rely less on memory work than lower-performing places like Ireland, the UK, and Australia.

American Mathematics educator Jo Boaler, renowned for “Creative Math,” jumped upon the PISA study to buttress her case against “memorization” in elementary classrooms. In a highly contentious November 2016 Scientific American article, Boaler and co-author Pablo Zoido contended that PISA findings confirmed that “memorizers turned out to be the lowest achievers, and countries with high numbers of them—the U.S. was in the top third—also had the highest proportion of teens doing poorly on the PISA math assessment.” Students who relied on memorization, they further argued, were “approximately half a year behind students who used relational and self-monitoring strategies” such as those in Japan and France.

Australian education researcher Greg Ashman took a closer look at the PISA study and called into question such hasty interpretations of the findings. Figure 1.2, “How teachers teach and students learn,” caught his eye, and he went to work interrogating the survey responses on “memorization” and the axes used to present the data. The PISA analysis, he discovered, also did not assess how teaching methods might be correlated with PISA scores in Mathematics. Manitoba Mathematics professor Robert Craigen spotted a giant hole in the PISA analysis, noting that the “memorization” data related to students’ at-home study strategies, not their instructional experiences, and may well indicate that students who are improperly instructed in class resort to memorization on their own.

What would it look like, Ashman wondered, if the PISA report had plotted student performance against the preferred methods on the continuum from “more student-oriented instruction” to “more teacher-directed instruction”? Breaking down all the data, he generated a new graph showing how teaching method correlated with math performance and found a “positive correlation” between teacher-directed instruction and higher Math scores. Correlations, he duly noted, “do not necessarily imply causal relationships,” but the re-plotted data clearly favoured a higher ratio of teacher-directed activity to student orientation.

Jumping on the latest research to seek justification for her own “meta-beliefs” is normal practice for Boaler and her “Discovery Math” education disciples. After junking, once again, the ‘strawmen’ of traditional Mathematics — “rote memorization” and “drill” — Boaler and Zoido wax philosophical and poetic: “If American classrooms begin to present the subject as one of open, visual, creative inquiry, accompanied by growth-mindset messages, more students will engage with math’s real beauty. PISA scores would rise, and, more important, our society could better tap the unlimited mathematical potential of our children.” That’s definitely stretching the evidence far beyond the breaking point.

The “Math Wars” do generate what University of Virginia psychologist Daniel T. Willingham has aptly described as “a fair amount of caricature.” The recent Boaler-Zoido Scientific American article is a prime example of that tendency. Most serious scholars of cognition tend to support the common ground position that learning mathematics requires three distinct types of knowledge: factual, procedural, and conceptual. “Factual knowledge,” Willingham points out, “includes having already in memory the answers to a small set of problems of addition, subtraction, multiplication, and division.” While some students can learn Mathematics through invented strategies, that cannot be relied upon for all children. On the other hand, knowledge of procedures is no guarantee of conceptual understanding, particularly when it comes to complexities such as dividing fractions. It’s clear to most sensible observers that knowing math facts, procedures, and concepts is what counts when it comes to mastering mathematics.

Simply ignoring research that contradicts your ‘meta-beliefs’ is common on the Math Education battlefield. Recent academic research on “memorization” that contradicts Boaler and her entourage is simply ignored, even when it emanates from her own university. Two years ago, Shaozheng Qin and Vinod Menon of Stanford University Medical School led a team that produced scientifically-validated evidence that “rote memorization” plays a critical role in building the capacity to solve complex calculations.

In their 2014 Nature Neuroscience study, based upon a clinical study of 68 children, aged 7 to 9, followed over the course of one year, Qin, Menon, and their colleagues found that memorizing the answers to simple math problems, such as basic addition or multiplication, forms a key step in a child’s cognitive development, helping bridge the gap between counting on fingers and tackling more complex calculations. Memorizing the basics, they concluded, is the gateway to activating the hippocampus, a key brain structure for memory, which gradually expands in “overlapping waves” to accommodate the greater demands of more complex math.

The whole debate over memorization is suspect because of the imprecision in the use of the term. Practice, drilling, and memorization are not the same, even though they get conflated in Jo Boaler’s work and in much of the current Mathematics Education literature. Back in July 2012, D.T. Willingham made this crucial point and provided some valuable points of distinction. “Practice,” as defined by Anders Ericsson, involves performing tasks with feedback on that performance, executed for the purpose of improvement. “Drilling” connotes repetition for the purpose of achieving automaticity, which, at its worst, amounts to mindless repetition or parroting. “Memorization,” on the other hand, relates to the goal of getting something into long-term memory with ready access, but does not imply any particular method of achieving that goal.

Memorization has become a dirty word in teaching and learning, laden with so much baggage that it conjures up mental pictures of “drill and kill” in the classroom. The 2016 PISA study appears to perpetuate such stereotyping and, worst of all, completely misses the “positive correlation” between teacher-directed or explicit instruction and better performance in mathematics.

Why does the PISA Study tend to associate memorization in home-study settings with the drudgery of drill in the classroom?  To what extent does the PISA Study on Mathematics Teaching support the claims made by Jo Boaler and her ‘Discovery Math’ advocates? When it comes to assessing the most effective teaching methods, why did the PISA researchers essentially take a pass? 

 


A lively national conversation is underway in the United States over stalled upward mobility and stark income inequality and it has a more genteel echo in Canada.  Many North American educators point to poverty as the explanation for American students’ mediocre test scores and it also serves as a favoured rationale for explaining away the wide variations in achievement levels among and within Canadian provinces. Only recently have policy analysts, boring down into the PISA 2012 Mathematics data, begun to look at the alarming achievement gap between states and provinces, the relationship between education expenditures and performance levels, and the bunching of students in the mid-range of achievement.

The socio-economic determinists offer a simple-minded, mono-causal explanation for chronic student under-performance. American education policy analysts Michael Petrilli and Brandon Wright of The Thomas B. Fordham Institute recently recapped the standard lines: If teachers in struggling U.S. schools taught in Finland, says Finnish educator Pasi Sahlberg, they would flourish—in part because of “support from homes unchallenged by poverty.” Michael Rebell and Jessica Wolff at Columbia University’s Teachers College argue that middling test scores reflect a “poverty crisis” in the United States, not an “education crisis.” Adding union muscle to the argument, American Federation of Teachers president Randi Weingarten calls poverty “the elephant in the room” that accounts for poor student performance.

The best data we have to tackle the critical questions comes from the OECD Program for International Student Assessment (PISA), which just released its annual Education at a Glance 2015 report.  For its own analyses, PISA uses an index of economic, social, and cultural status (ESCS) that considers parental occupation and education, family wealth, home educational resources, and family possessions related to “classical” culture. PISA analysts use the index to stratify each country’s student population into quartiles. That broadens the focus so it’s not just about addressing the under-performance of disadvantaged children.

The PISA socio-economic analysis identifies the key variations among international educational jurisdictions. Countries like Belgium and France are relatively better at teaching their higher-status students, while other countries like Canada and Finland do relatively better at instructing students from lower-status families. Contrary to past assumptions, the United States falls almost exactly on the regression line. It does equally well (or equally poorly, if you prefer) at teaching the least well-off as those coming from families in the top quartile of the ESCS index.

A Fall 2014 Education Next report by Eric Hanushek, Paul Peterson and Ludger Woessmann pointed out the wide variations, country-to-country, in overall Mathematics proficiency.   Some 35 percent of the members of the U.S. class of 2015 (NAEP) reach or exceed the proficiency level in math. Based on their calculations, this percentage places the United States at the 27th rank among the 34 OECD countries. That ranking is somewhat lower for students from advantaged backgrounds (28th) than for those from disadvantaged ones (20th).

Overall assessments of Mathematics proficiency on PISA offer no real surprises. Compared to the U.S., the percentage of students who are math proficient is nearly twice as large in Korea (65%), Japan (59%), and Switzerland (57%). The United States also lags behind Finland (52%), Canada (51%), Germany (50%), Australia (45%), France (42%), and the United Kingdom (41%). Within the U.S., the range is phenomenal – from a high of 51% in Massachusetts to a low of 19% in Mississippi.

Cross-national comparisons can be misleading, because Canadian students have plateaued on the PISA tests over the past decade. While Canada remains among the high-level achievers, the performance of the country’s 15-year-olds in mathematics has declined, with a 14-point dip over the past nine years. While performance in reading has remained relatively stable, the decline in science performance was “statistically significant,” dipping from an average of 534 in 2006 to 529 in 2009.

Much like the United States, Canada exhibits significant variations from one provincial school system to another. A 2013 Council of Ministers of Education, Canada (CMEC) review of the OECD PISA 2012 Mathematics performance levels revealed stark achievement inequalities. Four Canadian provinces set the pace – Quebec, British Columbia, Alberta, and Ontario – and the remaining six are a drag on our average scores. Fully 25% of Prince Edward Island students score below Level 2 in Mathematics proficiency, worse than the OECD average (23%). The provinces with the next highest levels of under-performers were: Manitoba (21%), Newfoundland/Labrador (21%), Nova Scotia (18%), and New Brunswick (16%).

There is no case for complacency in Canada, as pointed out, repeatedly, by Dr. Paul Cappon, former CEO of the Canadian Council on Learning (2005-2011) and our leading expert on comparative international standards. For a “high-achieving” country, Canada has a relatively low proportion of students who perform at the highest levels of Mathematics on recent PISA tests (CMEC 2013, Figure 1.3, p. 25). Canada’s 15-year-olds are increasingly bunched in the mid-range and, when it comes to scoring Level 4 and above on Mathematics, most provinces score at or below the OECD average of 31%. The proportion of high-achievers (Level 4 and above in 2012) was as follows: PEI (22%); Newfoundland/Labrador (27%); Nova Scotia (28%); Manitoba (28%); Saskatchewan (33%); and Ontario (36%). Quebec continues to be an exception: 48% of its students score Level 4 and above, 17 points above the OECD average.

Students coming from families with high education levels also tend to do well on the PISA Mathematics tests. The top five OECD countries in this category are Korea (73%), Poland (71%), Japan (68%), Switzerland (65%), and Germany (64%), marginally ahead of the state of Massachusetts at 62%. Five other American states have proficiency rates of 58% or 59%, comparable to the Czech Republic (58%) and higher than Canada (57%) and Finland (56%). Canada ranked 12th on this measure, well behind Korea, Poland, Japan, Switzerland, and Germany.

Educators professing to be “progressive” in outlook tend to insist that we must cure poverty before we can raise the standards of student performance. More pragmatic educators tend to claim that Canadian schools are doing fine, except for the schools serving our disadvantaged populations, particularly Indigenous and Black children.  Taking a broad, international perspective, it appears that both assumptions are questionable. There are really two achievement gaps to be bridged – one between the affluent/advantaged and the poor/disadvantaged and the other one between Canadian high achievers and their counterparts in the top PISA performing countries.

Does low Socio-Economic Status (SES) marked by child and family poverty set the pattern for student achievement in a deterministic fashion? To what extent can and do students break out of that mold? How critical are other factors such as better teacher quality, higher curriculum standards, and ingrained ethno-cultural attitudes? Do school systems like Canada’s and Finland’s tend to focus on reducing educational inequalities at the expense of challenging their high achievers? Is this the real reason that many leading western G20 countries continue to lag behind those in Asia?


Today the Organization for Economic Cooperation and Development (OECD) has succeeded in establishing the Program for International Student Assessment (PISA) test and national rankings as the “gold standard” in international education. Once every three years since 2000, PISA has provided a global benchmark of where students 15 years of age rank in three core competencies — reading, mathematics, and science. Since its inception, United States educators have never been enamoured with international testing, in large part because American students rarely fare very well.

So, when the infamous OECD PISA Letter was published in early May 2014 in The Guardian and later The Washington Post, the initial signatory list contained the names of some familiar American anti-testing crusaders, such as Heinz-Dieter Meyer (SUNY, Albany), David Berliner (Arizona State University), Mark Naison (BAT, Fordham University), Noam Chomsky (MIT) and Alfie Kohn, the irrepressible education gadfly. That letter, addressed to Andreas Schleicher, OECD, Paris, registered serious concerns about "the negative consequences of the PISA rankings" and appealed for a one-cycle (three-year) delay in the further implementation of the tests.

The global campaign to discredit PISA earned a stiff rebuke in Canada. On June 11 and June 18, 2014, the C.D. Howe Institute released two short commentaries demonstrating the significant value of PISA test results and effectively countering the appeal of the anti-PISA Letter. Written by Education Fellow John Richards, the two-part report highlighted the "Bad News" in Canada's PISA Results and then proceeded to identify What Works (specific lessons to be learned) based upon an in-depth analysis of the triennial tests. In clear, understandable language, Richards identified four key findings to guide policies formulated to "put Canadian students back on track."

The call for a pause in the PISA tests was clearly an attempt to derail the whole international movement to establish benchmarks of student performance and some standard of accountability for student achievement levels in over 60 countries around the world. It was mainly driven by American anti-testers, but the two Canadian-based signatories were radical, anti-colonialist academics: Henry Giroux (English and Cultural Studies, McMaster University) and Arlo Kempf (Visiting Professor, Program Coordinator, School and Society, OISE).

Leading Canadian educationists like Dr. Paul Cappon (former CEO, Council on Learning) and even School Change guru Michael Fullan remain supporters of comparative international student assessments. That explains why no one of any real standing or clout from Canada was among the initial group, and, by late June, only 32 Canadian educationists could be found among the 1988 signatories from all over the globe. Most of the home-grown signatories were well known educators in what might be termed the “accountability-free” camp, many like E. Wayne Ross (UBC) and Marc Spooner (U Regina), fierce opponents of “neo-liberalism” and its supposed handmaiden, student testing.

John Richards’ recent C.D. Howe commentaries should, at least temporarily, silence the vocal band of Canadian anti-testers. His first commentary made very effective use of PISA student results to bore deeply into our key strengths and issues of concern, province by province, focusing particularly on student competencies in mathematics. That comparative analysis is fair, judicious, and research-based, in sharp contrast to the honey-coated PISA studies regularly offered up by the Council of Ministers of Education, Canada (CMEC).

The PISA results tell the story. While he finds Canadian students overall "doing reasonably well," the main concern is statistically significant declines in all provinces in at least one subject, usually either mathematics or reading. Quebec leads in Mathematics, but in no other subject. Two provinces (PEI and Manitoba) experienced significant declines in all three subject areas. Performance levels have sharply declined (over 30 points) in mathematics in both Manitoba and Canada’s former leader, Alberta. Such results are not a ringing endorsement of the Mathematics curriculum based upon the Western and Northern Canada Protocol (WNCP).

The warning signs are, by now, well known, but the real value in Richards’ PISA Results analysis lies in his very precise explanation of the actual lessons to be learned by educators.  What really matters, based upon PISA results, are public access to early learning programs, posting of school-level student achievement results, paying professional level teacher salaries, and the competition provided by achievement-oriented private and  independent (not for profit) schools. Most significantly, his analysis confirms that smaller class sizes (below 20 pupils per class) and increasing mathematics teaching time have a negligible effect on student performance results.

The C.D. Howe PISA Results analysis hit home with The Globe and Mail, drawing a favourable editorial, but was predictably ignored by the established gatekeepers of Canada’s provincial education systems. Why the reluctance to confront such research-based, common-sense findings?  "Outing" the chronic under-performance of students from certain provinces (PEI, Manitoba, New Brunswick, and Nova Scotia) is taboo, particularly inside the tight CMEC community and within self-referential Canadian Education Association (CEA) circles. For the current Chair of CMEC, Alberta Education Minister Jeff Johnson, any public talk of Alberta’s precipitous decline in Mathematics is anathema.

Stung by the PISA warning shots, Canada’s provincial education gatekeepers tend to be less receptive to sound, research-based, practical policy correctives. That is a shame because the John Richards reports demonstrate that both “sides” in the ongoing  Education War are half-right and by mixing and matching we could fashion a much more viable, sustainable, effective policy agenda. Let’s tear up the existing and tiresome Neo-Con vs. Anti-Testing formulas — and re-frame education reform around what works – broader access to early learning, open accountability for student performance levels, paying respectable, professional-level teacher salaries, and welcoming useful competition from performance-driven private and independent schools.

What’s the recent American public noise over "PISAfication" all about anyway?  Why do so many North American educators still tend to dismiss the PISA Test and the sound, research-based studies stemming from the international testing movement?  To what extent do John Richards’ recent C.D. Howe Institute studies suggest the need for a total realignment of provincial education reform initiatives?

Read Full Post »

Teaching all children Mathematics may well be possible. That’s the inspiring lesson delivered by Dr. John Mighton at an April 24 Public Lecture, sponsored by the Mount Saint Vincent University Faculty of Education and attended by 150 curious educators and concerned parents. He is the founder of JUMP (Junior Undiscovered Math Prodigies), a Toronto-based charitable organization that seeks to "multiply the potential in children" and to instill in them the joy of truly mastering mathematics.

Mighton is an incredibly talented mathematician on a mission. Founded as a kitchen-table tutoring group in 1998, JUMP Math is presently challenging the prevailing "discovery math" ideology embraced by North American curriculum consultants and reinforced in textbooks and online resources published by the giant learning-industry multinationals, Pearson and Oxford/Nelson. Since June of 2013, JUMP Math has been breaking out with new adoptions in Manitoba, Calgary, and Vancouver, where teachers are looking to significantly improve elementary-level student math performance.

The founder of JUMP Math shot to prominence in 2003 with the publication of his book, The Myth of Ability.  Leading mathematicians like Dr. Robert Dawson, Editor of the Canadian Mathematical Society Notes, sat up and took notice.  In the Newsletter, he compared Mighton to the classroom teacher Jaime Escalante in the inspiring feature film, Stand and Deliver.  Both educators, he noted, embraced the idea that mathematics was “something that everybody can learn to do.”  His book, he added, “may be a big step in that direction.”

The Mathematics Education Wars are fought on contested pedagogical terrain, and Mighton’s JUMP Math is emerging as a logical and welcome middle ground. In his recent lectures, he makes a persuasive case for a "balanced" approach, starting with fundamentals and then empowering students to engage in creative problem-solving activities. He’s clear in explaining the limitations of both "drill and fill" traditional teaching and the "fuzzy Math" promoted by romantic progressives.

“Students must be empowered to succeed” is his consistent message. Beginning math instruction is broken down into tiny, carefully-structured chunks that any student, working with any teacher, can learn thoroughly. It’s teacher-guided but also exploratory, and it provides elementary students with the scaffolding needed to eventually tackle creative problem-solving. “Teachers are my heroes,” he says, because they are the ones who have driven the spread of JUMP Math, not the math consultants.

Canadians tend to be slow to embrace their own heroes and seek validation of their talents elsewhere. Mighton holds a Ph.D. in mathematics from the University of Toronto, completed NSERC postdoctoral research in knot and graph theory, teaches Mathematics at U of T, and in 2010 was appointed an Officer of the Order of Canada. He’s also a playwright and screenwriter, known in Hollywood for his role in the feature film Good Will Hunting.

Mighton’s JUMP Math has evolved significantly over the past decade and now boasts supportive classroom effectiveness research, including studies at Toronto’s Sick Kids Hospital, in Lambeth, UK, and at the Mabin School. While he was once "the nation’s math conscience," Manitoba Education Minister James Allum now sees his approach as giving that province an edge over provinces like Alberta, wedded to the standard Western and Northern Canada Protocol (WNCP) curriculum and continuing with "less successful methods."

What’s standing in the way of Mathematics education reform?  Two key factors jump out as the obvious explanation – the established “Discovery Learning” ideology and the preponderant influence of its proponents, the late Richard Dunne (1944-2012), creator of Maths Makes Sense, and his Canadian counterpart, Dr. Marian Small, purveyor of Nelson mathematics problem-solving books.  They are a formidable force backed by the Pearson and Oxford/Nelson publishing conglomerates and a small army of textbook author replicators here in Canada.

Richard Dunne and his Canadian camp followers talk about mathematics but their real agenda is to promote a “whole school approach” to discovery learning.   His distinctive teaching style,  initiated at Reading Boys’ Grammar School in the late 1960s, uses concrete “manipulatives” to help kids understand math concepts.  Based upon his theories rather than research, Dunne cut a plastic cup into 10 pieces to demonstrate the meaning of decimals and then developed other dramatic demonstration techniques to introduce children to abstract ideas.

Dunne was a teacher and math consultant rather than a mathematician.  His earlier version of Maths Makes Sense published in the 1980s proved popular with teachers who were non-specialists, but was resisted by many university based mathematicians and then rejected by the British Government in 1989 with the introduction of a more rigorous National Curriculum. Panned in the U.K., his teaching methods enjoyed greater popularity in North America and his version of “Discovery Math”  made a comeback in 2007 with the re-publication of Maths Makes Sense.

Dunne’s “whole school approach” was embraced by North American math consultants and education schools seeking to promote “discovery learning” in all subject areas. Secondary school mathematics specialists remained skeptical and most stayed true to traditional methods, but Discovery Math made deep inroads among regular elementary teachers, often with little or no mathematics training. It achieved the height of its influence in Canada when the WNCP Math curriculum spread across the provinces, supported by the Pearson Canada Math Makes Sense series of books and online resources.

Declining Mathematics achievement levels from 2003 to 2012, on PISA and Canadian national tests, began to raise red flags.  A WISE Math movement, sparked by Winnipeg math professors Anna Stokke and Robert Craigen, demonstrated the direct relationship between declining scores and the spread of  Dunne-inspired WNCP curricula.  In September 2013, Manitoba re-introduced Math fundamentals and approved JUMP Math for use in the schools.  Over the past year, the number of students studying JUMP Math has jumped from 90,000 to 110,000 as more and more schools are breaking with the entrenched Discovery Math methods and adopting a more systematic, teacher-guided, step-by-step progression in their teaching of early mathematics.

What’s standing in the way of Math correction in North American elementary schools?  Why has the “whole school approach” made such inroads in the teaching of Mathematics in the early grades?  Can all, or the vast majority of, students be taught Mathematics? Will Dr. John Mighton eventually be vindicated for promoting fundamental building blocks?  Which of the Canadian provinces will be next in abandoning the core philosophy of the Discovery Math/WNCP curriculum?

Read Full Post »

The ‘Big Test’ has hit us and rocked our education world. The sliding math scores of Canadian 15-year-olds outside Quebec have just captured all the headlines, and a series of PISA news stories and commentaries identified the “discovery learning” approach to teaching mathematics as the source of the recent and continuing decline. Columnist Konrad Yakabuski, a close observer of the American education wars, saw the declining math scores as a “damaging legacy” of discovery learning. We are falling backward, he claimed, in both excellence and equity, raising the fundamental question – “Has the education elite learned its lesson?”

In the OECD’s 2012 Programme for International Student Assessment (PISA) rankings released December 3, 2013, Canada dropped out of the top 10 in student mathematics scores, a decline that raised alarms about the country’s future prosperity. Canadian students placed 13th overall in mathematics, down three spots from 2009 and six spots from 2006, in the highly anticipated test conducted every three years, which measures how 15-year-olds around the world are doing in math, reading and science. Canada ranked behind many Asian economies, including Shanghai (China), Singapore, Korea and Japan, while the United States lagged far behind at 36th out of 65 participating countries.

The PISA test jolt comes on the heels of declining math scores nationally and a surprisingly poor showing by youth on a recent OECD literacy and numeracy test. The Canadian math curriculum, ushered in over the past decade, is catching the blame for lower scores, and for good reason. Curricula like the Western and Northern Canada Protocol (WNCP) are out of sync with high-performing Asian countries because they place far more emphasis on real-world concepts than on abstract thinking, standard algorithms, and practice. The accompanying OECD report, in fact, noted that the top performers had more exposure to formal mathematics than word problems. That may explain why Shanghai students topped the rankings and performed three grade levels above those of most other nations.

Topping the PISA student performance rankings attracts international acclaim, school system imitators, and increasingly scarce public education dollars. Once reviled by Canadian anti-testing advocates, the PISA test results are – oddly enough – what provides the ammunition for much of what now passes for informed debate over quality, equity, and accountability in Canada’s provincial school systems. They also bred a certain Canadian complacency until the recent release of the 2012 student results.

National and provincial reputations now ride on the PISA results. From 2000 to 2006, the PISA test results catapulted Finland’s education system to star status, and that ‘Finnish infatuation’ essentially swept the Canadian educational establishment off its feet, blinding us to Quebec’s success in mathematics and Ontario’s progress in improving reading and closing the socio-economic education gap.

Between 2000 and 2009, Canada plateaued in overall student performance and Canadian students posted a 10 per cent decline in reading scores. This week’s PISA results confirm that 15-year-old Canadian students, with the exception of those in Quebec, are losing ground, particularly in mathematics.

The rise and fall of Alberta, Canada’s former top performing province, contains a few valuable lessons. Two decades ago, Alberta was the first province to really confront the global learning gap, forecasting that, if trends continued, Albertan and Canadian students were going to be left behind.  

Dr. Joe Freedman, a Red Deer radiologist, and Andrew Nikiforuk, a Calgary-based Globe and Mail columnist, raised the first alarm bells and founded Albertans for Quality Education.  In 1991, they convinced the Alberta Chamber of Resources (ACR) and the Conference Board of Canada to produce a truly ground-breaking study,  International Comparisons in Education, comparing Alberta math and science curriculum with that in Japan, Germany and Hungary.

Alberta’s mathematics and science curriculum was then virtually re-written and bench-marked against that of the top performing nations. Under Education Minister Jim Dinning, the province built its rock solid reputation on raising standards, student testing, school choice and charter schools.

While Alberta ranked first on the PISA tests and topped the Pan-Canadian Assessment Programme (PCAP) tests in literacy and science for most of two decades, it has slipped precipitously since 2006. Adopting the WNCP math curriculum with its “discovery learning” focus and the Finnish infatuation have been key factors in the decline.

The ‘Finnish solution’ began to lose its lustre after the 2009 PISA test when Finland saw its reading scores drop by 11 per cent. Outside of Canada, education policy analysts have now become far more enamoured with Asian school systems like Shanghai and Korea.

None of this seems to matter to Canadian ‘progressives,’ sponsoring a Canadian tour for Finnish education expert Pasi Sahlberg, promoting Finland as the “Global Fourth Way,” and seeking to curtail standardized testing. They are bent on turning back the dreaded “GERM,” the Global Education Reform Movement, supposedly carrying the plague of “neo-liberalism” and its principal strains — higher standards, school choice, and competition in public education.

The Alberta Teachers Association (ATA), armed with a 2012 report written by Sahlberg’s North American ally, Andy Hargreaves, now talks of “transforming Alberta education” with “The Fourth Way,” and is out to dismantle provincial testing, curtail expanded classroom learning time, and block teacher assessment tied to student performance. More recently, the Finnish wave of “personalized learning” has reached British Columbia.

Finland, like Canada, got a jolt from the 2012 PISA test results. That will finally prompt education observers to acknowledge that Finnish education is fuzzy on standards.  It is, after all,  light on standardized testing, soft on homework, and promotes a “culture of trust” instead of accountability.

Looking deeper, Finland is also a “one provider” system with little or no choice for parents, delays the start of school until age 7, and streams students after Grade 9  into two tracks, academic and vocational, based upon arbitrary average-mark cut-offs.

The Canadian attraction to “discovery learning” and the rush to abandon standardized testing have both hit a significant bump in the road. In the wake of the 2012 PISA results, Canadians are awakening to the dangers of turning back the clock to the days of ‘accountability-free’ public education. Without PISA and the OECD follow-up research studies we are left almost completely in the dark on critical educational quality issues that matter for students and our public schools.

What are the powerful lessons of Canada’s recent decline in PISA test scores?  When will Canadian mathematics educators face reality and come to accept the need to develop a more rigorous, soundly-based curriculum providing a solid grounding in the fundamental skills?  Will Canada come to accept the need to stop being what Paul Cappon aptly termed “a school that never issues report cards”?  And finally, is the real message sinking in?

Read Full Post »

Older Posts »