
Archive for the ‘Mathematics Education’ Category

The most recent Fraser Institute report (April 2021) on the mathematics performance of students across Canada contained very few surprises. Students from Quebec continue to be at the head of the class. On the benchmark Program for International Student Assessment (PISA) from 2003 to 2018, they scored the highest (532 in 2018), 20 points above the Canadian average, and continued to outpace students from any other province. Steep declines have been registered by students in Alberta (-38 points), British Columbia (-34 points), and Saskatchewan (-31 points). Students from two Maritime provinces, Nova Scotia and New Brunswick, have steadily declined and now hover around the OECD mean score of 494.


Most interesting to analyze is New Brunswick, because it exemplifies why Canadian students produce such mediocre results. With PISA scores dropping from 511 (2003) to 491 (2018), New Brunswick 15-year-olds perform well below the national mean on a "steadily negative" trajectory over the past fifteen years. Over the past three PISA tests, 2012 to 2018, their scores declined by 2.2 per cent, third worst among the provinces. The province's national Grade 8 PCAP results for 2010 to 2016, while below the national mean, do show slight improvement, albeit on assessments keyed to provincial curriculum standards. What jumps out at you in the report, however, is the row of blanks for provincial math assessments in New Brunswick and the statement "insufficient data to estimate trends."

Assessing student capabilities in mathematics should, one would think, be a provincial priority when there’s plenty of evidence that students are still struggling in math. The clearest example of this, confirmed in interviews with math tutors over the past two weeks, is that most N.B. students today are so lacking in basic computational skills that they cannot complete secondary school math placement tests without a calculator.

Calculator dependence is now widespread in New Brunswick schools, and its most telling impact is in the lagging mathematics achievement of students. The use of calculators in North American math classrooms has been common since the 1980s, but top-performing nations such as Singapore, China and Korea put far more emphasis on integrating mental computation with conceptual understanding before progressing to higher-level math and problem solving. That approach is also reflected in the most successful after-school math tutoring programs, such as Kumon Math and the Toronto-based alternative, Spirit of Math, widely used in Ontario independent schools.

Provincial school officials do not generally react to periodic reports that students are struggling in mathematics, pointing instead to rising teacher-assigned student grades and healthy graduation rates. Those in the 'shadow school system' of private tutoring and the math assessment offices of universities and colleges have no such inhibitions. Most are alarmed at what they see and learn while conducting intake assessments of prospective students. Most incoming students perform one or two grades below expected levels and, moving upwards through the grades, wide variations appear in skill levels and competencies.

‘Discovery Math’ is the prevailing teaching approach in the vast majority of N.B. elementary schools, and the tutors insist it's not working for far too many students. "Most students have gaps in their skills," says Rhonda Connell, manager of Fredericton's Kumon Math and Reading operation, who has 28 years of tutoring experience. "The N.B. curriculum is not skills-based, but rather more exploratory of different methods."

What’s wrong with that approach?  “Students in public schools without basic skills get taught long and complicated operations and the kids get lost,” Connell tells me. “They don’t know their mental math and that’s why high school students simply cannot do the Kumon placement test without a calculator.”

The mathematics deficits grow as students progress from elementary grades into high school. “There’s a widening gap,” says Connell. She finds that students do not know their fractions, cannot do long division or basic subtraction and borrowing operations. The bottom line: “Students don’t have the skills at hand to engage in problem-solving and higher-level math.”

The founder of Mathnasium in Moncton, Jocelyn Chan, saw through the eyes of her son, now seven years of age, that mathematics education was sadly lacking. As a CPA with plenty of corporate finance experience, she decided to do something about it by opening the first Mathnasium franchise operation in Atlantic Canada. Since opening in October 2020, it has grown from four or five students to 70 enrolments today, with a majority of students in Grades 5 and 6, where the math deficits become more pronounced and visible to parents.

The pandemic shutdowns and default to hybrid learning have set students back, particularly in a more teacher-dependent subject like mathematics. "A lot of Moncton area students were already behind to begin with," Chan says, "so the learning loss is more acute." "Lots of Grade 9s this year are struggling," she notes, "because COVID-19 caused them to lose half of their Grade 8 year, leaving them unprepared for the next grade."

Private after-school tutoring programs such as Kumon and Mathnasium cater to upwardly mobile, affluent families with the financial resources to afford them. Out of 331 Kumon operations in Canada, there's only one in New Brunswick. While the Fredericton Kumon centre run by Connell has grown steadily from some 30 to 40 students in 1993 to 141 students today, that's still a small fraction of the total student population.

Many of the new clients also turn out to be newcomers, recently arrived in the province. Most local parents, according to Connell and Chan, only become concerned when they see their children falling behind or getting lower grades. “People moving here from elsewhere,” Connell notes, “expect more” and “come to Kumon saying that there’s nothing going on in the schools.”

Unaddressed math problems surface again when students proceed to university and find themselves in popular programs like management, marketing, or economics, where some math skills are required to master the core content. Many turn to mathematics and language remediation programs.

Senior math instructor C. Hope Alderson is on the front lines as coordinator of the UNB Saint John Flora Beckett Mathematics and Science Help Centre. As a mathematics tutor, she spends most of her time building the skills and confidence of students struggling in their university courses. Choosing her words carefully, Dr. Alderson confirms what private after-school tutors say about today's students. "Students have quite an attachment to the calculator," is how she puts it. "There's certainly less emphasis on mental computations in today's schools. They grab the calculator to do simple calculations."

The pandemic is not helping the situation. Faced with stay-at-home orders, students and families were left with online remedial programs or strictly limited in-person, socially distanced tutoring. Enrolment at Kumon Fredericton peaked in 2019, just before the school shutdown. Since then, home learning and family stresses have kept families away from Kumon. "Family stresses ran high," says Connell, "and it had an effect on students' abilities to focus on their math." Separation from their social group was especially hard on teenage students.

Mastery of basic math skills is being sadly neglected in our K-12 schools. Conceptual understanding should not be emphasized to the virtual exclusion of mental computation skills. Getting a calculator to do the mathematics for you has contributed to the entrenched problem.

*An earlier version of this commentary appeared in the provincial edition of The Telegraph-Journal in New Brunswick.

Why are Canadian students losing ground in Mathematics on the benchmark PISA tests administered every three years?  What can we learn from a case study looking at the state of math competencies in New Brunswick? Is it a combination of factors?  If so, what needs to be done to address the underperformance of our students on international assessments?  

Read Full Post »

Ontario’s Mathematics program for Kindergarten to Grade 12 has just undergone a significant revision in the wake of the continuing decline in student performance in recent years. On June 24, 2020, Education Minister Stephen Lecce unveiled the new mathematics curriculum for elementary school students with a promised emphasis on the development of basic concepts and fundamental skills. In a seemingly contradictory move, the Minister also announced that the government was cancelling next year’s EQAO testing in Grades 3 and 6 to give students and teachers a chance to get used to the new curriculum.

While the Doug Ford government was elected in June 2018 on a "Back to the Basics" education pledge, the new mathematics curriculum falls considerably short of that commitment. Although the phrase "back to the basics" adorned the media release, the actual message to parents and the public put more emphasis on providing children with practical skills. Financial literacy will be taught at every grade level, and all students will learn coding or computer programming skills, starting in Grade 1 in Ontario schools. A more detailed analysis of the actual math curriculum changes reveals a few modest steps toward reaffirming fundamental computation skills, but all cast within a framework emphasizing the teaching of "social-emotional learning skills."

The prevailing “Discovery Math” philosophy enshrined in the 2005 Ontario curriculum may no longer be officially sanctioned, but it remains entrenched in current teaching practice. Simply issuing provincial curriculum mandates will not change that unless teachers themselves take ownership of the curriculum changes. Cutting the number of learning outcomes for Grades 1 to 8 down to 465 “expectations” of learning, some 150 fewer than back in 2005, will be welcomed, especially if it leads to greater mastery of fewer outcomes in the early grades.

The parents’ guide to the new math curriculum, released with the policy document, undercuts the “back to basics” commitment and tilts in a different direction. The most significant revamp is not the reintroduction of times tables, teaching fractions earlier on, or emphasizing the mastery of standard algorithms. It is the introduction of a completely new “strand” with the descriptor “social-emotional learning skills.” That new piece is supposedly designed to help students “develop confidence, cope with challenges, and think critically.” It also embodies the ‘discovery learning‘ approach of encouraging students to “use strategies” and “be resourceful” in “working through challenging problems.”

Ontario’s most influential mathematics curriculum consultants, bracing for the worst, were quick to seize upon the unexpected gift.  Assistant professor of math education at the Ontario Institute for Studies in Education (OISE), Mary Reid, widely known for supporting the 2005 curriculum philosophy, identified the “social-emotional learning” component as “critically important” because it would “help kids tremendously.” That reaction was to be expected because Reid’s research focuses on “math anxiety” and building student confidence through social-emotional learning skills development.

Long-time advocates for higher math standards such as Math teacher Barry Garelick and Ottawa parent Clive Packer saw the recommended approach echoing the prevailing ‘discovery math’ ideology.  Expecting to see a clear statement endorsing mastering the fundamentals and building confidence through enhanced competencies, they encountered documents guiding teachers, once again, toward “making math engaging, fun and interesting for kids.” The whole notion that today’s math teachers utilizing traditional methods stress “rote memorization” and teach kids to “follow procedure without understanding why” is completely bogus. Such caricatures essentially foreclose on serious discussion about what works in the math classroom.

How does the new Ontario math curriculum compare with the former 2005 curriculum?  Identifying a few key components allows us to spot the similarities and differences:

Structure and Content:

  • New curriculum: "clear connections show how math skills build from year to year," consistent for English-language and French-language learners.
  • Former 2005 curriculum: Difficult to make connections from year-to-year, and inconsistencies in expectations for English-speaking and French-speaking learners.

Multiplication and division:

  • Grade 3, new curriculum: "recall and demonstrate multiplication facts of 2, 5, and 10, and related division facts." In graduated steps, students learn multiplication facts, starting with 0 × 0 and building to 12 × 12, to "enhance problem solving and mental math."
  • Grade 3, 2005 curriculum: "multiply to 7 × 7 and divide to 49 ÷ 7, using a variety of mental strategies (e.g., doubles, doubles plus another set, skip counting)." No explicit requirement to teach multiplication tables.

Fractions:

  • Grade 1, new curriculum: “introduced to the idea of fractions, through the context of sharing things equally.”
  • Grade 1, 2005 curriculum: Vague reference – “introducing the concept of equality using only concrete materials.”

Measurement of angles:

  • Grade 6, new curriculum: “use a protractor to measure and construct angles up to 360°, and state the relationship between angles that are measured clockwise and those that are measured counterclockwise.”
  • Grade 6, 2005 curriculum: “measure and construct angles up to 180° using a protractor, and classify them as acute, right, obtuse, or straight angles.”

Graphing data:

  • Grade 8, new curriculum: “select from among a variety of graphs, including scatter plots, the type of graph best suited to represent various sets of data; display the data in the graphs with proper sources, titles, and labels, and appropriate scales; and justify their choice of graphs “
  • Grade 8, 2005 curriculum: “select an appropriate type of graph to represent a set of data, graph the data using technology, and justify the choice of graph”

Improvements in the 2020 math curriculum are incremental at best and likely insufficient to make a significant difference. Providing students with effective instruction in mathematics is, after all, what ultimately leads to confidence, motivation, engagement, and critical thinking. Starting with confidence-building exercises gets it all backwards. Elementary mathematics teachers will be guided, first, toward developing social and emotional learning (SEL) skills: (1) identify and manage emotions; (2) recognize sources of stress and cope with challenges; (3) maintain positive motivation and perseverance; (4) build relationships and communicate effectively; (5) develop self-awareness and a sense of identity; (6) think critically and creatively. Upon closer scrutiny, these are generic skills which are not only problematic but also entirely unmeasurable.

The fundamental question raised by the new Ontario math curriculum reform is whether it is equal to the task of improving stagnating student test scores. Student results in English-language schools in Grade 3 and Grade 6 mathematics, on EQAO tests, slid consistently from 2012 to 2018. Back in 2012, 68 per cent of Grade 3 students met provincial standards; in 2018, only 58 per cent did. In Grade 6 mathematics, it was worse, plummeting from 58 per cent to 48 per cent meeting provincial standards. On international tests, Ontario's Program for International Student Assessment (PISA) math scores peaked in 2003 at 530, dropped to 509 by 2015, then recovered slightly in 2018 to 514, consistent with the provincial slide (see the graph compiled by Greg Ashman). Tinkering with math outcomes and clinging to ineffective "mathematical processes" will likely not be enough to change that trajectory.

Building self-esteem and investing resources in more social and emotional learning (SEL) is not enough to turn around student math achievement. Yet, reviewing the new mathematics curriculum, the Ontario curriculum designers seem to have lost their way. It all looks strangely disconnected from the supposed goal of the reform: to raise provincial math standards and improve student performance on provincial, national, and international assessments.

What’s the real purpose of the new Ontario mathematics curriculum reform?  Does the latest curriculum revision reflect the 2018 commitment to move forward with fundamentals or is it a thinly-disguised attempt to integrate social and emotional learning into the program?  Where is the evidence, in the proposed curriculum, that Ontario education authorities are laser focused on improving math standards? Will this latest reform make much of a difference for students looking for a bigger challenge or struggling in math? 

Read Full Post »

Student achievement varies a great deal across the countries of the Organization for Economic Cooperation and Development (OECD). Good teachers can have a significant impact upon their students' learning and achievement, and there is now research to support that contention. What makes some teachers more effective than others is less clear. It remains one question that cries out for further in-depth study.

A comprehensive research study reported in the latest issue of Education Next (Vol. 19, Spring 2019) tackles that fundamental question on an international comparative scale. Three American researchers, Eric A. Hanushek, Marc Piopiunik, and Simon Wiederhold, not only demonstrate that teachers' cognitive skills vary widely among developed nations, but also that such differences matter greatly for student performance in school.

Developing, recruiting and training a teacher force with higher cognitive skills (Hanushek, Piopiunik, Wiederhold 2019) can be critical in improving student learning. “An increase of one standard deviation in teacher cognitive skills,” they claim, “is associated with an increase of 10 to 15 per cent of a standard deviation in student performance.” Comparing reading and math scores in 31 OECD countries, teachers in Finland come out with the highest cognitive skills. One quarter of the gaps in average student performance across countries would be closed if each of them were to raise the level of teachers’ cognitive skills to that of Finland.
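To put that effect size in concrete terms, here is a rough back-of-the-envelope sketch of my own, not drawn from the study itself: assuming PISA student scores are scaled with a standard deviation of roughly 100 points, a gain of 10 to 15 per cent of a standard deviation works out to about 10 to 15 PISA points.

```python
# Back-of-the-envelope illustration (assumption: PISA student scores are scaled
# with an SD of roughly 100 points; 0.10-0.15 is the effect-size range reported above).
PISA_STUDENT_SD = 100                 # assumed standard deviation of PISA student scores
effect_low, effect_high = 0.10, 0.15  # student-score gain per +1 SD in teacher cognitive skills

gain_low = effect_low * PISA_STUDENT_SD
gain_high = effect_high * PISA_STUDENT_SD
print(f"+1 SD in teacher cognitive skills ~ {gain_low:.0f} to {gain_high:.0f} PISA points")
```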

What’s most fascinating about this study is the large role Canadian teachers play in the comparative data analysis for teacher cognitive skills.  Of the 6,402 teacher test-takers in 31 countries, the largest group, 834 (13 per cent), were from Canada. Based upon data gleaned from the OECD Program for the International Assessment of Adult Competencies (PIAAC), we now know where Canadian teachers rank in terms of their numeracy and literacy skills (See Figure 1). We also have a clearer indication of how Canadians with Bachelor’s degrees and Master’s or Doctoral degrees rate in terms of their core cognitive skills.

Teachers from Canada fare reasonably well, in the top third, in the comparative analysis of cognitive skills. In literacy, teachers in Canada perform above average, with a median score of 308 points out of 500 compared to the sample-wide average of 295 points. If there's a problem, it's in numeracy skills, where they score only marginally above the teacher-wide sample, with a median of 293 points compared to the average of 292. Adult Canadians with Bachelor's degrees actually outperform teachers in numeracy skills by 7 points, and teachers in Finland and Japan, for example, perform better than Canadians with Master's or Doctoral degrees.

Since the September 2010 appearance of the McKinsey & Company study "Closing the Talent Gap," American policy-makers have considered teachers' own academic performance a "key predictor" of higher student achievement, based upon teacher recruitment practices in countries that perform well on international tests. High-scoring countries like Singapore, Finland and Korea, for example, recruit their teacher force from the top third of their academic cohorts in university.

Securing sound data on the actual quality of recent Canadian teacher education cohorts is challenging because of the paucity of reported information. One claim put forward in the 2010 McKinsey & Company study, that Canadian teachers come from the "top one third of high school graduates," looks highly suspect.

A September 2008 review of Initial Teacher Education Programs (Gambhir, Evans, Broad, Gaskell 2008) reported that admission cut-offs ranged from 65 per cent to over 90 per cent, depending upon the faculty of education. Most of the Canadian universities with Faculty of Education programs, to cite another fact, still have grade cut-off averages for acceptance into Arts and Science that hover between 70 per cent and 75 per cent. With the exception of OISE, Western, Queen's and UBC, teacher candidates are not drawn from the top third of their academic cohort, particularly in mathematics and sciences.

Differences in teachers' cognitive skills within a country also seem to have a bearing upon student performance. Plotting the student performance difference between math and reading (at the country level) against the difference in teacher cognitive skills between numeracy and literacy yields some intriguing results (Figure 2). An increase in teacher cognitive skills of one standard deviation is estimated to improve student achievement by 11 per cent of a standard deviation. The data for Canada show a teacher test-score difference between numeracy and literacy of -12 points.

The brand-new American study (Hanushek, Piopiunik, Wiederhold 2019) also suggests that paying teachers better may help attract and retain teachers with higher cognitive skills. In terms of wage premiums, teachers' earnings in higher-performing countries are generally higher, as borne out by Ireland, Germany and Korea, where teachers earn 30 to 45 per cent more than comparable college graduates in other jobs.

Teachers in Canada earn 17 per cent more than their comparators, while those in the USA and Sweden earn 22 per cent less. Increasing teacher pay has potential value in the United States where salaries discourage the ‘best and brightest’ from entering teaching. There is a caveat, noted by Hanushek and his research team:  Changes in policy must ensure that “higher salaries go to more effective teachers.”

Do smarter teachers make for smarter students? How sound is the evidence that teachers who know more are actually better teachers? Why do we put so much stock in improving student learning in literacy/reading and mathematics?  What potential flaws can you spot in this type of research? 


Read Full Post »

Quebec students head the class when it comes to mathematics. On the Pan-Canadian Assessment Program (PCAP) tests of Grade 8 students, written in June 2016 and released in early May 2018, those from Quebec finished first in mathematics (541), thirty points above the Canadian mean score of 511 and a gain of 26 points over the past six years.

The latest national results solidified Quebec's position as our national leader in mathematics achievement on every comparative test over the past thirty years. How and why Quebec students continue to dominate and, in effect, pull up Canada's international math rankings deserves far more public discussion. Every time math results are announced, they generate a flurry of interest, but that interest does not appear to have encouraged other provinces to try to emulate Quebec's success.

Since the first International Assessment of Educational Progress (IAEP) back in 1988, and in the four national and international mathematics tests that followed up to 2000, Quebec students generally outperformed students from other Canadian provinces at grades four, eight and eleven. That pattern has continued right up to the present, demonstrated impressively on the most recent Program for International Student Assessment (PISA 2015), where Quebec 15-year-olds scored 544, ranking among the world's top education jurisdictions.

One enterprising venture, launched in 2000 by the B.C. Ministry of Education under Deputy Minister Charles Ungerleider, did tackle the question by comparing British Columbia’s and Quebec’s mathematics curricula. That comparative research project identified significant curricular differences between the two provinces, but the resulting B.C. reform initiative ran aground on what University of Victoria researchers Helen Raptis and Laurie Baxter aptly described as the “jagged shores of top-down educational reform.”

Over the past thirty years, the reasons for Quebec's dominance in K-12 mathematics performance have come into sharper relief. The B.C. Ministry of Education's 2000 research project exposed and explained the curricular and pedagogical factors, and subject specialists, including both university mathematicians and mathematics education professors, have gradually filled in the missing pieces. Mathematics education faculty with experience in Quebec and elsewhere help to complete the picture.

Five major factors can now be identified to explain why Quebec students continue to lead the pack in pan-Canadian mathematics achievement:

  1. Clearer Curriculum Philosophy and Sequence:

The scope and sequence of the math curriculum is clearer, demonstrating an acceptance of the need for integration and progression of skills. The 1980 Quebec Ministry of Education curriculum set the pattern.  Much more emphasis in teacher education and in the classroom was placed upon building sound foundations before progressing to problem-solving. Curriculum guidelines were much more explicit about making connections with previously learned material.

Quebec’s Grade 4 curriculum made explicit reference to the ability to develop speed and accuracy in mental and written calculation and to multiply larger numbers as well as to perform reverse operations. By grade 11, students were required to summon “all their knowledge (algebra, geometry, statistics and the sciences) and all the means at their disposal…to solve problems.” “The way math is presented makes the difference,” says Genevieve Boulet, Mount St. Vincent University Mathematics education professor with prior experience preparing mathematics teachers at Université de Sherbrooke.

  2. Superior Math Curriculum

Fewer topics tend to be covered at each grade level, but in more depth than in B.C. and other Canadian provinces. In Grade 4, students are introduced right away to Numbers/Operations, and the curriculum unit on measurement focuses on mastering three topics (length, area, and volume) instead of a smattering of six or seven. Concrete manipulatives are more widely used to facilitate comprehension of more abstract math concepts. Much heavier emphasis is placed on Numbers/Operations, as Grade 4 students are expected to perform addition, subtraction, and multiplication using fractions. Secondary school in Quebec begins in Grade 7 (Secondaire I) and ends in Grade 11 (Secondaire V), and, given that organizational model, students are more likely to be taught by mathematics subject specialists. Quebec's Grade 11 graduation courses, Mathematics 536 (Advanced), Mathematics 526 (Transitional) and Mathematics 514 (Basic), once covered the same range of topics but to different depths. More recently, Quebec has revamped its mathematics program and now offers three streamed courses, designated 565 Science Option, 564 Technical and Science Option, and 563 Cultural, Social, Technical and Science Option.

  3. More Extensive Teacher Training

Teacher preparation programs in Quebec universities are four years long, providing students with double the amount of time to master mathematics as part of their teaching repertoire, a particular advantage for elementary teachers. In Quebec faculties of education, elementary school math teachers must take as many as 225 hours of university courses in math education; in some provinces, the instructional time averages around 40 hours.

Teacher-guided or didactic instruction has been one of the Quebec teaching program’s strengths. Annie Savard, a McGill University education professor, points out that Quebec teachers have a clearer understanding of ‘didactic’ instruction, a concept championed in France and French-speaking countries. They are taught to differentiate between teaching and learning. “Knowing the content of the course isn’t enough, “ Savard says. “You need what we call didactic [teaching]. You need to unpack the content to make it accessible to students.”

Teacher pedagogy in mathematics makes a difference. Outside of Quebec, the dominant pedagogy is child-centred and heavily influenced by Jean Piaget and behaviorist theories of learning. Prospective teachers are encouraged to use ‘discovery learning’ and to respond to stimuli by applying the appropriate operations. In Quebec, problem-solving is integrated throughout the curriculum rather than treated as a separate topic. Shorter teacher training programs, according to Boulet, shortchange teacher candidates and can adversely affect their preparedness for the classroom. Four-year programs afford education professors more time to expose teacher candidates to the latest research on cognitive psychology which challenges the efficacy of child-centred approaches to the subject.

  4. Secondary School Examinations

Students in Quebec still write provincial examinations, and achieving a pass in mathematics is a requirement to secure a graduation (Secondaire V) diploma. Back in 1992, Quebec mathematics examinations were a core component of a very extensive set of ministry examinations, numbering two dozen, administered in Grade 9 (Sec III), Grade 10 (Sec IV), and Grade 11 (Sec V). Since 2011-12, most Canadian provinces, except Quebec, have moved, province by province, to either eliminate Grade 12 graduation examinations, reduce their weighting, or make them optional. In the case of B.C., the Grade 12 provincial exam was cancelled in 2012-13, and in Alberta the equivalent examination now carries a much reduced weighting in final grades. As of June 2018, Quebec continues to have final provincial exams, albeit fewer and limited mainly to mathematics and the two languages. Retaining exams has a way of keeping students focused to the end of the year; removing them has been linked to both grade inflation and the lowering of standards.

  5. Preparedness Philosophy and Graduation Rates

Academic achievement in mathematics has remained a system-wide priority, and there is much less emphasis in Quebec on pushing every student through to high school graduation. From 1980 to the early 2000s, the Quebec mathematics curriculum was explicitly designed to prepare students for mastery of the subject, either to "prepare for further study" or to instill a "mathematical way of thinking," reflecting the focus on subject matter. The comparable B.C. curriculum for 1987, for example, stated that mathematics was aimed at enabling students to "function in the workplace." Already, by the 1980s, the teaching of B.C. mathematics was seen to encompass sound reasoning, problem-solving ability, communications skills, and the use of technology. Curriculum fragmentation, driven by educators' desires to meet individual student needs, never really came to dominate the Quebec secondary mathematics program.

Quebec’s education system remains that of ‘a province unlike the others.’  While the province sets the pace in mathematics achievement, a February 2018 report demonstrated that it lags significantly behind the others in graduation rates. Comparing Quebec’s education system with that of Ontario, Education Minister Sebastien Proulx points out, is “like comparing apples to oranges.”  The passing grade in Quebec courses is 60 per cent compared to 50 per cent in Ontario and the requirements for a graduation diploma are more demanding because of the final examinations. When the passing grade was raised in 1986-87, ministry official Robert Maheu noted, the decision was made to firm up school standards. Student achievement indicators, particularly in mathematics, drove education policy and, until recently, unlike other provinces, student preparedness remained a higher priority than raising graduation rates.

Quebec Math and the Rest – Vive la Différence

School systems are, after all, not always interchangeable, and context is critical in assessing student outcomes. As David F. Robitaille and Robert A. Garden’s 1989 IEA Study reminded us, systems are “in part a product of the histories, national psyches, and societal aspirations” of the societies in which they develop and reside. While British Columbia and the other English-speaking provinces have all been greatly influenced by American educational theorists, most notably John Dewey and the progressives, Quebec is markedly different. Immersed in a French educational milieu, the Quebec mathematics curriculum has been, and continues to be, more driven by mastery of subject knowledge, didactic pedagogy, and a more focused, less fragmented approach to student intellectual development.

Socio-historical and cultural factors weigh heavily in explaining why Quebec continues to set the pace in mathematics achievement. Challenging curricula and final examinations produce higher math scores, but they also contribute to lower graduation rates.

* A revised version of this post was published October 22, 2018 by the IRPP magazine, Policy Options.

Read Full Post »

A recent New York Times commentary by American engineering professor Barbara Oakley has, once again, stirred up much public debate over the critical need for "math practice" and why current "Discovery Math" methodologies are hurting students, especially girls. "You and your daughter can have fun throwing eggs off a building and making paper-mache volcanoes," she wrote, "but the only way to create a full set of options for her in STEM is to ensure that she has a solid foundation in math." Mathematics is "the language of science, engineering and technology," Oakley reminded us. And like any language, she claimed, it is "best acquired through lengthy, in-depth practice."

That widely-circulated commentary was merely the latest in a series of academic articles, policy papers, and education blog posts to take issue with the prevailing ideology in North American Mathematics education, championed by Professor Jo Boaler of Stanford University’s School of Education and her disciples.  Teaching the basics, explicit instruction, and deliberate practice are all, in Boaler’s view, examples of “bad math education” that contribute to “hating Math” among children and “Math phobia” among the populace. Her theories, promulgated in books and on the “YouCubed” education website, make the case that teaching the times tables and practicing “multiplication” are detrimental, discovering math through experimentation is vital, and making mistakes is part of learning the subject.

Boaler has emerged in recent years as the leading edu-guru in mathematics education, with a wide following, especially among elementary math teachers. Under Ontario's former Kathleen Wynne government, Boaler served as a prominent, highly visible member of the Math Knowledge Network (MKN) Advisory Council, charged with advancing the province's well-funded Math Renewal Strategy. Newsletters generated by the MKN as part of that strategy featured inspirational passages from Jo Boaler exhorting teachers to adopt 'fun' strategies and to be sensitive to "student well-being."

While Boaler was promoting her "Mathematical Mindset" theories, serious questions were being raised about the thoroughness of her research, the accuracy of her resources, and the legitimacy of her claims about what works in the math classroom. Dr. Boaler had successfully weathered a significant challenge to her scholarly research by three Stanford mathematics professors who found fault with her "Railside School" study. Now she was facing scrutiny directed at YouCubed by cognitive science professor Yana Weinstein and New York math teacher Michael Pershan. Glaring errors were identified in YouCubed learning materials, and the research basis for claims made in "Mistakes Grow Your Brain" was seriously called into question. The underlying neuroscience research by Jason S. Moser and his associates does not demonstrate the concept of "brain sparks" or that the "brain grows" from mistakes, but rather that people learn when made aware of their mistakes.

Leading researchers and teachers associated with researchED are at the forefront of the current wave of evidence-based criticism of Boaler's theories and contentions. Australian teacher-researcher Greg Ashman, author of The Truth About Teaching (2018), was prompted to critically examine her claims by Jo Boaler's response to the new UK math curriculum, which includes "multiplication practice." Memorizing times tables, she told TES, was "terrible": "I have never memorised my times tables," she said. "I still have not memorised my times tables. It has never held me back, even though I work with maths every day." Then, for clarification: "It is not terrible to remember maths facts; what is terrible is sending kids away to memorise them and giving them tests on them which will set up this maths anxiety."

Ashman flatly rejected Boaler's claims on the basis of the latest cognitive research. His response tapped into "cognitive load" research, and it bears repeating: "Knowing maths facts such as times tables is incredibly useful in mathematics. When we solve problems, we have to use our working memory, which is extremely limited and can only cope with processing a few items at a time. If we know our tables then we can simply draw on these answers from our long-term memory when required. If we do not, then we have to use our limited working memory to figure them out when required, leaving less processing power for the rest of the problem and causing 'cognitive overload'; an unpleasant feeling of frustration that is far from motivating."

British teachers supportive of the new math curriculum are now weighing in and picking holes in Boaler's theories. One outspoken math educator, "The Quirky Teacher," posted a detailed critique explaining why Boaler was "wrong about math facts and timed tests." Delving deeply into the published research, she provided evidence from studies and her own experience to demonstrate that "learning maths facts off by heart and the use of timed tests are actually beneficial to every aspect of mathematical competency (not just procedural fluency)." "Children who don't know their math facts end up confused," she noted, while those who do are far more likely to become "better, and therefore more confident and happy, mathematicians."

Next up was Pennsylvania State University professor Paul L. Morgan, research director of his university's Center for Educational Disparities Research. Popular claims by Boaler and her followers that "math practice and drilling" stifle creativity and interfere with "understanding mathematical concepts" were, in his view, ill-founded. Routine practice and drilling through explicit instruction, Morgan contended in Psychology Today, would "help students do better in math, particularly those who are already struggling in elementary school." Based upon research into Grade 1 math achievement involving 13,000 U.S. students, his team found that, of all possible strategies, "only teacher-directed instruction consistently predicted greater first grade achievement in mathematics."

Critiques of Jo Boaler's theories and teaching resources spark immediate responses from the reigning math guru and her legions of classroom teacher followers. One of her Stanford Graduate School of Education students, Emma Gargroetzi, a PhD candidate in education equity studies and curator of the Soulscrutiny blog, rallied to her defense following Barbara Oakley's New York Times piece. She did so by citing most of the "Discovery Math" research produced by Boaler and her research associates, and she sounded stunned when Oakley used the exchange as an opportunity to present conflicting research and to further her graduate education.

Some of the impassioned response is actually sparked by Boaler’s own social media exhortations. In the wake of the firestorm, Boaler posted this rather revealing tweet: “If you are not getting pushback, you are probably not being disruptive enough.” It was vintage Boaler — a Mathematics educator whose favourite slogan is “Viva la Revolution.”  In the case of Canadian education, it is really more about defending the status quo against a new generation of more ‘research-informed’ teachers and parents.

Far too much Canadian public discourse on Mathematics curriculum and teaching simply perpetuates the competing stereotypes and narratives. Continued resistance to John Mighton and his JUMP Math program is indicative of the continuing influence wielded by Boaler and her camp. Doug Ford’s Progressive Conservative Government is out to restore “Math fundamentals” and determined to break the curriculum gridlock.  The recent debate over Ontario Math education reform on Steve Paikin’s TVOntario program The Agenda featured the usual competing claims, covered familiar ground, and suggested that evidence-based discussion has not yet arrived in Canada.

What explains Professor Jo Boaler’s success in promoting her Math theories and influencing Math curriculum renewal over the past decade? How much of it is related to YouCubed teaching resources and the alignment with Carol Dweck’s ‘growth mindset’ framework? Do Boaler’s theories on Math teaching work in the classroom? What impact, if any, have such approaches had on the decline of Math achievement in Ontario and elsewhere?  When will the latest research on cognitive learning find its way to Canada and begin to inform curriculum reform?


Read Full Post »

Surveying the education public as well as 'stakeholders' for their opinions is the latest trend in Canadian K-12 education policy. Two education surveys conducted in Nova Scotia and Alberta provide recent examples worthy of further discussion. The release of Alberta Education Minister David Eggen's curriculum survey results (April 13, 2017) also demonstrates that unsuspecting citizens may need help penetrating the official spin to get at the actual results.

Facing deep divisions in P-12 education over future directions, and not inclined to follow the research evidence, provincial authorities are turning, more and more, to soliciting public opinion through surveys with pre-determined outcomes. Upon closer scrutiny, the Alberta survey seems wholly designed to confirm intended curriculum directions.

Conducting public surveys is not without its risks. In the case of the 2014 Nova Scotia Education Review survey, a largely unvarnished, no-holds-barred instrument actually backfired on the Education Department. When the N.S. Review Committee headed by Myra Freeman polled 18,500 residents, the results published in October 2014 proved a real jolt and sent the provincial teachers’ union into a tizzy, mostly focused on being excluded from shaping the survey and serving on the commission.

One half of Nova Scotians, the survey found, were "not satisfied with the public school system," and teachers as well as parents identified plenty of reasons why. The report, Disrupting the Status Quo, generated very high expectations, never honoured, that major reform was on the way. A three-month NSTU teacher work-to-rule in 2016-17 effectively sank the quality education reform plan and generated a completely new set of teacher-driven demands for improvement in "working conditions."

Alberta Education had no desire to see that pattern repeated. Minister Eggen's curriculum survey looked, and sounded, skewed in the Education Department's preferred direction: toward more of what is loosely termed "21st century learning." In Alberta Education's futuristic doubletalk, the overarching goal is to produce students who "are agents of change to create the globe that they want to be part of."

The survey, conducted in October and November 2016, succeeded in attracting some 32,390 respondents, of whom only slightly over half (57%) might be classed as 'outside the system.' The proposed directions were presented as amorphous curriculum "themes" where respondents were clearly led to certain conclusions. You are, for example, asked whether you agree or disagree with this statement: "Through learning outcomes curriculum should support the development of literacy, numeracy and 21st century competencies." It is impossible to answer if you think basic numeracy and literacy should take precedence over the ill-defined futuristic skills.

Conducting the survey was also further confirmation of the provincial strategy to thwart mathematics education reform. With the Alberta "Back to Basics" petition, initiated by parent Dr. Nhung Tran-Davies of Calmar, AB, piling up 18,332 signatures, the survey attempted, in clumsy fashion, to override that hardened opinion.

The Department's summary of responses does its best to conceal the extent of resistance to current K-12 mathematics teaching and curricula. Sifting through the mathematics responses shows that teaching math facts, restoring step-by-step algorithmic thinking, limiting the use of computers, and mastering mental math far outweighed any preference for "21st century competencies" or their step-child, discovery math.

Instead of addressing these findings, Minister Eggen 'cherry-picked' one example of the desire for 'relevance': support for including financial literacy in Grade 4 to 9 classes. That, too, is a clear sign that parents want their kids to be able to balance a set of sums.

Albertans' written responses to the open-ended questions are the clearest indication of their true inclinations. Out of the 15,724 respondents committed enough to do more than tick boxes, the largest segment, again (10 per cent), favoured refocusing on "math basics" and singled out "discovery math" as a problem. Combined with "learning the basics" (6 per cent) and teaching practical skills (7 per cent), roughly one in four of those who made comments targeted the lack of rigour in the curriculum.

Judging from the wording of questions, the entire survey also skewed in the direction of student-centred teaching methods. That’s strange because the recent PISA 2015 global results report demonstrated conclusively that “explicit instruction” produced much better student results than “minimally-guided instruction.”

The inherent bias pops up elsewhere. “This survey,” it reported, “was intended to focus on the ‘what’ of current provincial curriculum not ‘how’ teachers teach it.”   Serious curriculum analysts know it’s now virtually impossible to separate the two in assessing program effectiveness.

Provincial education authorities were, at one time, given explicit mandates based upon either firm political policy positions or best practice research. When governments are lost and searching for direction, they may turn to the public to find their bearings. In the case of Alberta, it looks more like surveying for confirmation of the ‘educrats’ own pre-determined direction.

*A condensed version of this Commentary appeared in the Edmonton Journal, April 18, 2017.

Why do school systems survey the public? Are Canadian provincial governments totally lost on K-12 education and simply looking for direction? Do our education departments harbour a secret agenda? Or are they looking for public confirmation of pre-conceived plans for curriculum changes?

Read Full Post »

Developing a Growth Mindset in students and their teachers is perhaps the hottest trend in the education world outside of Canada. Originating in psychological science research conducted by Carol S. Dweck, starting in the late 1980s and continuing at Stanford University, it burst upon the education scene in 2006 with the publication of Dweck's influential book, Mindset: The New Psychology of Success. The next great thing, growth mindset, became an instant buzzword in many education faculties and professional development sessions.

The so-called Mindset Revolution, like most education fads, has also generated its share of imitations and mutations. Two of the best known are the Mathematical Mindset, promulgated by mathematics educator Jo Boaler, and a more recent Canadian spin-off, The Innovator's Mindset, the brainchild of George Couros, a division principal of Teaching and Learning with Parkland School District in Stony Plain, Alberta, Canada. While Growth Mindset 1.0 got little traction in Canada, the second-generation iteration dreamed up by Couros is increasingly popular among technology-savvy Canadian and American educators.

Legions of professional educators and teachers in the United States, Britain, and Australia have latched onto GM theory and practice with a vengeance. One reliable barometer of 'trendiness,' the George Lucas Educational Foundation website Edutopia, provides a steady stream of short vignettes and online videos extolling the virtues of GM in the classroom. The growing list of Growth Mindset pieces on Edutopia purport to "support students in believing that they can develop their talents and abilities through hard work, good strategies, and help from others."

What is the original conception of the Growth Mindset?  Here is how Carol Dweck explained it succinctly in the September 22, 2015 issue of Education Week: “We found that students’ mindsets—how they perceive their abilities—played a key role in their motivation and achievement, and we found that if we changed students’ mindsets, we could boost their achievement. More precisely, students who believed their intelligence could be developed (a growth mindset) outperformed those who believed their intelligence was fixed (a fixed mindset). And when students learned through a structured program that they could “grow their brains” and increase their intellectual abilities, they did better. Finally, we found that having children focus on the process that leads to learning (like hard work or trying new strategies) could foster a growth mindset and its benefits.”

Dweck's theory of Growth Mindsets gained credibility because, unlike most educational 'fads,' it emerged out of some sound initial research into brain plasticity and was tested in case studies with students in the schools. Leading education researcher Dylan Wiliam, a renowned student assessment expert, lent his support to the Growth Mindset movement when he embraced Dweck's findings and applied them to building 'feedback' into student assessment. He adopted this equation: Talent = Hard Work + Persistence (A Growth Mindset), and offered this endorsement: "The harder you work, the smarter you get. Once students begin to understand this 'growth mindset' as Carol Dweck calls it, students are much more likely to embrace feedback from their teachers."

Ten years on, cracks appeared in the Growth Mindset movement when some of the liveliest minds in education research began to probe more deeply into the theory, the follow-up studies, and the supposed evidence of student success. An early skeptic, the blogger Disappointed Idealist, hit a nerve with a brave little commentary on December 5, 2014, wondering whether the Growth Mindset described a world as we wanted it to be rather than one as it is, and likening it to "telling penguins to flap harder" (and they would be able to fly like other birds). Self-styled 'education progressives' have taken their cue from American writer Alfie Kohn, who weighed in with a widely read Salon commentary in which he argued that Dweck's research had been appropriated by "conservative" educators trying to "fix our kids" when we should be "fixing the system."

The Growth Mindset 'magic dust' is wearing thin in the United Kingdom. British education gadfly David Didau, The Learning Spy, initially "pretty psyched" by Dweck's theory, has grown increasingly skeptical over the past year or so. In a succession of pointed commentaries, he has punched holes in the assumption that all students possess unlimited "growth potential," examined why more recent GM interventions have not replicated Dweck's initial results, questioned whether GM is founded on pseudoscience, and even suggested that the whole theory might be "bollocks."

Intrepid Belgian education researcher Pedro De Bruyckere, co-author of Urban Myths About Learning and Education, has registered his concerns about the validity of the supporting research, citing University of Edinburgh psychologist Timothy Bates' findings. Based upon case studies with 12-year-olds in China, Bates found no evidence of the dramatic changes reported in Dweck's earlier studies: "People with a growth mindset don't cope any better with failure. If we give them the mindset intervention, it doesn't make them behave better. Kids with the growth mindset aren't getting better grades, either before or after our intervention study."

For much of the past two years, Dweck and her research associate Susan Mackie have been alerting researchers and education policy-makers to the spread of what is termed a "false growth mindset" in schools and classrooms in Australia as well as Britain and the United States. Too many teachers and parents, they point out, have either misinterpreted or debased the whole concept, reducing it to simple axioms like "Praise the effort, not the child (or the outcome)." In most cases, it's educational progressives, or parents, looking for alternatives to "drilling with standardized tests."

Dweck's greatest fear nowadays is that Growth Mindset has been appropriated by education professionals to reinforce existing student-centred practices and to suit their own purposes. That serious concern is worth repeating: "It's the fear that the mindset concepts, which grew up to counter the failed self-esteem movement, will be used to perpetuate that movement." In a December 2016 interview story in The Atlantic, she conceded that it was being used in precisely that way, in too many classrooms, and that it amounted to "blanketing everyone with praise, whether deserved or not."

A “false growth mindset” arises, according to Dweck, when educators use the term too liberally and simply do not really understand that it’s intended to motivate students to work harder and demonstrate more resilience in overcoming setbacks. She puts it this way:  “The growth mindset was intended to help close achievement gaps, not hide them. It is about telling the truth about a student’s current achievement and then, together, doing something about it, helping him or her become smarter.” Far too many growth mindset disciples, Dweck now recognizes, reverted to praising students rather than taking “the long and difficult journey” in the learning process and showing “how hard work, good strategies, and good use of resources lead to better learning.”

One of Dweck's most prominent champions, Jo Boaler, may be contributing to the misappropriation of Growth Mindset theory in her field. As an influential Stanford University mathematics education professor, Boaler is best known as an apostle of constructivist approaches to teaching mathematics in schools. She saw in Dweck's Growth Mindset theory confirmation that a "fixed mindset" was harmful to kids convinced that they "can't do math." It all fit nicely into her own conception of how children learn math best: by exploration and discovery in classrooms unleashing children's potential. It became, for Boaler, a means of addressing "inequalities" perpetuated by "ability groupings" in schools. It also served to advance her efforts to "significantly reposition mistakes in mathematics" and replace "crosses" with "gold stars" and whole-class "opportunities for learning."

The Canadian mutation, George Couros' The Innovator's Mindset, seeks to extend Carol Dweck's original theory into the realm of technology and creativity. Troubled by the limitations of Dweck's model and its emphasis on mastery of knowledge and skills, he made an "awesome" (his word) discovery: that GM could be a powerful leadership tool for advancing "continuous creation." In his mutation of the theory, the binary "fixed" vs. "growth" model morphs into a more advanced stage, termed the "innovator's mindset." In his fertile and creative mind, it is transmogrified into a completely new theory of teaching and learning.

Taking poetic licence with Dweck's research-based thesis, Couros spins a completely different interpretation in his fascinating professional blog, The Principal of Change:

As we look at how we see and “do” school, it is important to continuously shift to moving from consumption to creation, engagement to empowerment, and observation to application. It is not that the first replaces the latter, but that we are not settling for the former. A mindset that is simply open to “growth”, will not be enough in a world that is asking for continuous creation of not only products, but ideas.

Promising educational theories, even those founded on some robust initial research, can fall prey to prominent educators pushing their own ‘pet ideas’ and pedagogical theories. While a 2016 Education Week report demonstrates that Growth Mindset initiatives produce mixed results and British education researchers are having a field day picking apart Carol Dweck’s research findings, yet another version of her creation is emerging, making it even harder to assess the serious case studies of her work being replicated around the world.

Which version of Carol Dweck’s Growth Mindset theory and practice are we assessing – the original conception or the “false” conception? How and why did an educational theory intended to motivate students, instill a work ethic, and help kids overcome obstacles get so debased in translation into classroom practice? Is the fate of the Growth Mindset indicative of something more troubling in the world of education research?

Read Full Post »

With the release of the 2015 Program for International Student Assessment (PISA) results on the horizon, the Organization for Economic Cooperation and Development (OECD) Education Office has stoked up the “Math Wars” with a new study. While the October 2016 report examines a number of key questions related to teaching Mathematics, OECD Education chose to highlight its findings on “memorization,” presumably to dispel perceptions about “classroom drill” and its use in various countries.

The OECD, which administers the PISA assessments every three years to 15-year-olds from around the globe, periodically publishes reports looking at slices of the data. Its October 2016 report, Ten Questions for Mathematics Teachers and How PISA Can Help Answer Them, based upon the most recent 2012 results, tends to zero in on “memorization” and attempts to show that high-performing territories, like Shanghai-China, Korea, and Chinese-Taipei, rely less on memory work than lower-performing places like Ireland, the UK, and Australia.

American Mathematics educator Jo Boaler, renowned for “Creative Math,” jumped upon the PISA Study to buttress her case against “memorization” in elementary classrooms. In a highly contentious November 2016 Scientific American article, Boaler and co-author Pablo Zoido contended that PISA findings confirmed that “memorizers turned out to be the lowest achievers, and countries with high numbers of them—the U.S. was in the top third—also had the highest proportion of teens doing poorly on the PISA math assessment.” Students who relied on memorization, they further argued, were “approximately half a year behind students who used relational and self-monitoring strategies” such as those in Japan and France.

Australian education researcher Greg Ashman took a closer look at the PISA Study and called into question such hasty interpretations of the findings. Figure 1.2 (“How teachers teach and students learn”) caught his eye, and he went to work interrogating the survey responses on “memorization” and the axes used to present the data. The PISA analysis, he discovered, also did not include an assessment of how teaching methods might be correlated with PISA scores in Mathematics. Manitoba Mathematics professor Robert Craigen spotted a giant hole in the PISA analysis, noting that the “memorization” data related to the “at-home strategies of students,” not their instructional experiences, and may well indicate that students who are improperly instructed in class resort to memorization on their own.

What would it look like, Ashman wondered, if the PISA report had plotted how students performed in relation to the preferred methods used on the continuum from “more student-oriented instruction” to “more teacher-directed instruction”? Breaking down all the data, he generated a new graph that actually showed how teaching method correlated with math performance, and he found a “positive correlation” between teacher-directed instruction and higher Math scores. Correlations, he duly noted, do not necessarily imply causal relationships, but the higher-performing systems clearly reported a higher ratio of teacher-directed activity to student-oriented instruction.
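For readers curious about what such a re-analysis involves, here is a minimal sketch in Python of computing the kind of correlation Ashman describes. The instruction-style index and the mean scores below are invented placeholders, not the actual PISA country data; the point is only to show how a positive correlation coefficient is obtained and why it describes association rather than causation.

```python
# Minimal sketch of correlating an instruction-style index with mean math scores.
# All figures below are illustrative placeholders, NOT actual PISA data.
import numpy as np

# Hypothetical ratio of teacher-directed to student-oriented activity, by country
teacher_directed_index = np.array([0.62, 0.55, 0.48, 0.41, 0.35, 0.30, 0.22])
# Hypothetical mean mathematics scores for the same countries
mean_math_score = np.array([554, 536, 527, 511, 498, 492, 481])

# Pearson correlation between the index and the scores
r = np.corrcoef(teacher_directed_index, mean_math_score)[0, 1]
print(f"Pearson r = {r:.2f}")  # a positive r signals association, not causation
```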

Jumping on the latest research to seek justification for her own “meta-beliefs” is normal practice for Boaler and her “Discovery Math” education disciples. After junking, once again, the ‘straw men’ of traditional Mathematics, “rote memorization” and “drill,” Boaler and Zoido wax philosophical and poetic: “If American classrooms begin to present the subject as one of open, visual, creative inquiry, accompanied by growth-mindset messages, more students will engage with math’s real beauty. PISA scores would rise, and, more important, our society could better tap the unlimited mathematical potential of our children.” That is stretching the evidence far beyond the breaking point.

The “Math Wars” do generate what University of Virginia psychologist Daniel T. Willingham has aptly described as “a fair amount of caricature.” The recent Boaler-Zoido Scientific American article is a prime example of that tendency. Most serious scholars of cognition tend to support the common-ground position that learning mathematics requires three distinct types of knowledge: factual, procedural and conceptual. “Factual knowledge,” Willingham points out, “includes having already in memory the answers to a small set of problems of addition, subtraction, multiplication, and division.” While some students can learn Mathematics through invented strategies, that route cannot be relied upon for all children. On the other hand, knowledge of procedures is no guarantee of conceptual understanding, particularly when it comes to complexities such as dividing fractions. It is clear to most sensible observers that knowing math facts, procedures and concepts is what counts when it comes to mastering mathematics.

Simply ignoring research that contradicts your ‘meta-beliefs’ is common on the Math Education battlefield. Recent academic research on “memorization” that contradicts Boaler and her entourage is simply ignored, even when it emanates from her own university. Two years ago, Shaozheng Qin and Vinod Menon of Stanford University Medical School led a team that provided scientifically validated evidence that “rote memorization” plays a critical role in building the capacity to solve complex calculations.

In their 2014 Nature Neuroscience study, based upon a clinical study of 68 children, aged 7 to 9, followed over the course of one year, Qin, Menon and their colleagues found that memorizing the answers to simple math problems, such as basic addition or multiplication, forms a key step in a child’s cognitive development, helping bridge the gap between counting on fingers and tackling more complex calculations. Memorizing the basics, they concluded, is the gateway to activating the hippocampus, a key brain structure for memory, which gradually expands in “overlapping waves” to accommodate the greater demands of more complex math.

The whole debate over memorization is suspect because of the imprecision in the use of the term. Practice, drilling, and memorization are not the same, even though they get conflated in Jo Boaler’s work and in much of the current Mathematics Education literature. Back in July 2012, D.T. Willingham made this crucial point and provided some valuable points of distinction. “Practice,” as defined by Anders Ericsson, involves performing tasks and receiving feedback on that performance, executed for the purpose of improvement. “Drilling” connotes repetition for the purpose of achieving automaticity, which, at its worst, amounts to mindless repetition or parroting. “Memorization,” on the other hand, relates to the goal of something ending up in long-term memory with ready access, but does not imply using any particular method to achieve that goal.

Memorization has become a dirty word in teaching and learning, laden with so much baggage that it conjures up mental pictures of “drill and kill” in the classroom. The 2016 PISA Study appears to perpetuate such stereotyping and, worst of all, completely misses the “positive correlation” between teacher-directed or explicit instruction and better performance in mathematics.

Why does the PISA Study tend to associate memorization in home-study settings with the drudgery of drill in the classroom? To what extent does the PISA Study on Mathematics Teaching support the claims made by Jo Boaler and her ‘Discovery Math’ advocates? When it comes to assessing the most effective teaching methods, why did the PISA researchers essentially take a pass?

Read Full Post »

A lively national conversation is underway in the United States over stalled upward mobility and stark income inequality, and it has a more genteel echo in Canada. Many North American educators point to poverty as the explanation for American students’ mediocre test scores, and it also serves as a favoured rationale for explaining away the wide variations in achievement levels among and within Canadian provinces. Only recently have policy analysts, boring down into the PISA 2012 Mathematics data, begun to look at the alarming achievement gap between states and provinces, the relationship between education expenditures and performance levels, and the bunching of students in the mid-range of achievement.

The socio-economic determinists offer a simple-minded, mono-causal explanation for chronic student under-performance. American education policy analysts Michael Petrilli and Brandon Wright of The Thomas B. Fordham Institute recently recapped the standard lines: If teachers in struggling U.S. schools taught in Finland, says Finnish educator Pasi Sahlberg, they would flourish—in part because of “support from homes unchallenged by poverty.” Michael Rebell and Jessica Wolff at Columbia University’s Teachers College argue that middling test scores reflect a “poverty crisis” in the United States, not an “education crisis.” Adding union muscle to the argument, American Federation of Teachers president Randi Weingarten calls poverty “the elephant in the room” that accounts for poor student performance.

The best data we have to tackle the critical questions comes from the OECD Program for International Student Assessment (PISA), which just released its annual Education at a Glance 2015 report.  For its own analyses, PISA uses an index of economic, social, and cultural status (ESCS) that considers parental occupation and education, family wealth, home educational resources, and family possessions related to “classical” culture. PISA analysts use the index to stratify each country’s student population into quartiles. That broadens the focus so it’s not just about addressing the under-performance of disadvantaged children.
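To make the quartile approach concrete, here is a minimal sketch, assuming invented ESCS values and scores, of how a composite status index can be used to stratify a student sample and compare average math results across the resulting groups. The real ESCS index is built by the OECD from the survey components listed above; nothing here reproduces actual PISA data.

```python
# Minimal sketch of stratifying students into quartiles on a status index.
# The ESCS values and math scores below are invented for illustration only.
import pandas as pd

students = pd.DataFrame({
    "escs":       [-1.2, -0.8, -0.3, 0.0, 0.4, 0.9, 1.3, 1.8],
    "math_score": [455, 472, 490, 505, 512, 534, 541, 560],
})

# Split the sample into four equal-sized socio-economic groups
students["escs_quartile"] = pd.qcut(
    students["escs"], q=4, labels=["Q1 (lowest)", "Q2", "Q3", "Q4 (highest)"]
)

# Average math score within each quartile
print(students.groupby("escs_quartile", observed=True)["math_score"].mean())
```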

The PISA socio-economic analysis identifies the key variations among international educational jurisdictions. Countries like Belgium and France are relatively better at teaching their higher-status students, while other countries like Canada and Finland do relatively better at instructing students from lower-status families. Contrary to past assumptions, the United States falls almost exactly on the regression line. It does equally well (or equally poorly, if you prefer) at teaching the least well-off as those coming from families in the top quartile of the ESCS index.

A Fall 2014 Education Next report by Eric Hanushek, Paul Peterson and Ludger Woessmann pointed out the wide variations, country-to-country, in overall Mathematics proficiency. Some 35 percent of the members of the U.S. class of 2015 (NAEP) reach or exceed the proficiency level in math. Based on their calculations, this percentage places the United States at the 27th rank among the 34 OECD countries. That ranking is somewhat lower for students from advantaged backgrounds (28th) than for those from disadvantaged ones (20th).

Overall assessments of Mathematics proficiency on PISA offer no real surprises. Compared to the U.S., the percentage of students who are math proficient is nearly twice as large in Korea (65%), Japan (59%), and Switzerland (57%). The United States also lags behind Finland (52%), Canada (51%), Germany (50%), Australia (45%), France (42%), and the United Kingdom (41%). Within the U.S., the range is phenomenal – from a high of 51% in Massachusetts to a low of 19% in Mississippi.

Cross-national comparisons can also be misleading, because Canadian students have plateaued on the PISA tests over the past decade. While Canada remains among the high-level achievers, the performance of the country’s 15-year-olds in mathematics has declined, with a 14-point dip over the past nine years. Performance in reading has remained relatively stable, but the decline in science performance was “statistically significant,” dipping from an average of 534 in 2006 to 529 in 2009 and falling again on the most recent test.

Much like the United States, Canada exhibits significant variations from one provincial school system to another. A 2013 Council of Ministers of Education, Canada (CMEC) review of the OECD PISA 2012 Mathematics performance levels revealed the stark achievement inequalities. A few provinces – Quebec, British Columbia, and Ontario – set the pace, and the rest are a drag on our average scores. Fully 25% of Prince Edward Island students score below Level 2 in Mathematics proficiency, worse than the OECD average (23%). The provinces with the next highest levels of under-performers were: Manitoba (21%), Newfoundland/Labrador (21%), Nova Scotia (18%), and New Brunswick (16%).

There is no case for complacency in Canada, as pointed out repeatedly by Dr. Paul Cappon, former CEO of the Canadian Council on Learning (2005-2011) and our leading expert on comparative international standards. For a “high-achieving” country, Canada has a lower proportion of students who perform at the highest levels of Mathematics on recent PISA tests (CMEC 2013, Figure 1.3, p. 25). Canada’s 15-year-olds are increasingly bunched in the mid-range and, when it comes to scoring Level 4 and above in Mathematics, most provinces sit at or below the OECD average of 31%. The proportion of high achievers (Level 4 and above in 2012) was as follows: PEI (22%); Newfoundland/Labrador (27%); Nova Scotia (28%); Manitoba (28%); Saskatchewan (33%); and Ontario (36%). Mathematics students from Quebec remain the exception: 48% score Level 4 and above, 17 percentage points above the OECD average.

Students coming from families with high education levels also tend to do well on the PISA Mathematics tests. The top five OECD countries in this category are Korea (73%), Poland (71%), Japan (68%), Switzerland (65%) and Germany (64%), marginally ahead of the state of Massachusetts at 62%. Five other American states have high-achievement proficiency rates of 58% or 59%, comparable to the Czech Republic (58%) and higher than Canada (57%) and Finland (56%). Canada ranked 12th on this measure, well behind Korea, Poland, Japan, Switzerland and Germany.

Educators professing to be “progressive” in outlook tend to insist that we must cure poverty before we can raise the standards of student performance. More pragmatic educators tend to claim that Canadian schools are doing fine, except for the schools serving our disadvantaged populations, particularly Indigenous and Black children. Taking a broad, international perspective, it appears that both assumptions are questionable. There are really two achievement gaps to be bridged – one between the affluent/advantaged and the poor/disadvantaged, and the other between Canadian high achievers and their counterparts in the top-performing PISA countries.

Does low Socio-Economic Status (SES), marked by child and family poverty, set the pattern for student achievement in a deterministic fashion? To what extent can and do students break out of that mold? How critical are other factors such as better teacher quality, higher curriculum standards, and ingrained ethno-cultural attitudes? Do school systems like Canada’s and Finland’s tend to focus on reducing educational inequalities at the expense of challenging their high achievers? Is this the real reason that many leading western G20 countries continue to lag behind those in Asia?

Read Full Post »

Today the Organization for Economic Cooperation and Development (OECD) has succeeded in establishing the Program for International Student Assessment (PISA) test and its national rankings as the “gold standard” in international education. Once every three years since 2000, PISA has provided a global benchmark of where 15-year-old students rank in three core competencies — reading, mathematics, and science. Since its inception, United States educators have never been enamoured with international testing, in large part because American students rarely fare very well.

So, when the infamous OECD PISA Letter was published in early May 2014 in The Guardian and later in The Washington Post, the initial signatory list contained the names of some familiar American anti-testing crusaders, such as Heinz-Dieter Meyer (SUNY, Albany), David Berliner (Arizona State University), Mark Naison (BAT, Fordham University), Noam Chomsky (MIT) and Alfie Kohn, the irrepressible education gadfly. That letter, addressed to Andreas Schleicher of the OECD in Paris, registered serious concerns about “the negative consequences of the PISA rankings” and appealed for a one-cycle (three-year) delay in the further implementation of the tests.

The global campaign to discredit PISA earned a stiff rebuke in Canada. On June 11 and June 18, 2014, the C.D. Howe Institute released two short commentaries demonstrating the significant value of PISA test results and effectively countering the appeal of the anti-PISA Letter. Written by Education Fellow John Richards, the two-part report highlighted the “Bad News” in Canada’s PISA Results and then proceeded to identify What Works (specific lessons to be learned) based upon an in-depth analysis of the once-every-three-years tests. In clear, understandable language, Richards identified four key findings to guide policies formulated to “put Canadian students back on track.”

The call for a pause in the PISA tests was clearly an attempt to derail the whole international movement to establish benchmarks of student performance and some standard of accountability for student achievement levels in over 60 countries around the world. It was mainly driven by American anti-testers, but the two Canadian-based signatories were radical, anti-colonialist academics: Henry Giroux (English and Cultural Studies, McMaster University) and Arlo Kempf (Visiting Professor, Program Coordinator, School and Society, OISE).

Leading Canadian educationists like Dr. Paul Cappon (former CEO, Canadian Council on Learning) and even School Change guru Michael Fullan remain supporters of comparative international student assessments. That explains why no one of any real standing or clout from Canada was among the initial group and why, by late June, only 32 Canadian educationists could be found among the 1,988 signatories from all over the globe. Most of the home-grown signatories were well-known educators in what might be termed the “accountability-free” camp, many of them, like E. Wayne Ross (UBC) and Marc Spooner (U Regina), fierce opponents of “neo-liberalism” and its supposed handmaiden, student testing.

John Richards’ recent C.D. Howe commentaries should, at least temporarily, silence the vocal band of Canadian anti-testers. His first commentary made very effective use of PISA student results to bore deeply into our key strengths and issues of concern, province-by-province, focusing particularly on student competencies in mathematics. That comparative analysis is fair, judicious, and research-based, in sharp contrast to the honey-coated PISA studies regularly offered up by the Council of Ministers of Education (Canada).

The PISA results tell the story. While he finds Canadian students overall “doing reasonably well,” the main concern is statistically significant declines in all provinces in at least one subject, usually either mathematics or reading. Quebec leads in Mathematics, but in no other subject. Two provinces (PEI and Manitoba) experienced significant declines in all three subject areas. Performance levels have sharply declined (over 30 points) in mathematics in both Manitoba and Canada’s former leader, Alberta. Such results are not a ringing endorsement of the Mathematics curriculum based upon the Western and Northern Canada Protocol (WNCP).

The warning signs are, by now, well known, but the real value in Richards’ PISA Results analysis lies in his very precise explanation of the actual lessons to be learned by educators. What really matters, based upon PISA results, are public access to early learning programs, posting of school-level student achievement results, paying professional-level teacher salaries, and the competition provided by achievement-oriented private and independent (not-for-profit) schools. Most significantly, his analysis confirms that smaller class sizes (below 20 pupils per class) and increased mathematics teaching time have a negligible effect on student performance results.

The C.D. Howe PISA Results analysis hit home with The Globe and Mail, drawing a favourable editorial, but was predictably ignored by the established gatekeepers of Canada’s provincial education systems. Why the reluctance to confront such research-based, common-sense findings? “Outing” the chronic under-performance of students from certain provinces (PEI, Manitoba, New Brunswick, and Nova Scotia) is taboo, particularly inside the tight CMEC community and within the self-referenced Canadian Education Association (CEA) circles. For the current Chair of CMEC, Alberta Education Minister Jeff Johnson, any public talk of Alberta’s precipitous decline in Mathematics is anathema.

Stung by the PISA warning shots, Canada’s provincial education gatekeepers tend to be less receptive to sound, research-based, practical policy correctives. That is a shame, because the John Richards reports demonstrate that both “sides” in the ongoing Education War are half-right, and by mixing and matching we could fashion a much more viable, sustainable, effective policy agenda. Let’s tear up the existing and tiresome Neo-Con vs. Anti-Testing formulas and re-frame education reform around what works: broader access to early learning, open accountability for student performance levels, paying respectable, professional-level teacher salaries, and welcoming useful competition from performance-driven private and independent schools.

What’s the recent American public noise over “PISAfication” all about, anyway? Why do so many North American educators still tend to dismiss the PISA test and the sound, research-based studies stemming from the international testing movement? To what extent do John Richards’ recent C.D. Howe Institute studies suggest the need for a total realignment of provincial education reform initiatives?

Read Full Post »

Older Posts »