
Developing a Growth Mindset in students and their teachers is perhaps the hottest trend in the education world outside of Canada. Originating in psychological research conducted by Carol S. Dweck, starting in the late 1980s and continuing at Stanford University, it burst upon the education scene in 2006 with the publication of Dweck’s influential book, Mindset: The New Psychology of Success. Hailed as the next great thing, growth mindset became an instant buzzword in many education faculties and professional development sessions.

The so-called Mindset Revolution, like most education fads, has also generated its share of imitations and mutations. Two of the best known are the Mathematical Mindset, promulgated by Mathematics educator Jo Boaler, and a more recent Canadian spin-off, The Innovator’s Mindset, the brainchild of George Couros, a division principal of Teaching and Learning with Parkland School District in Stony Plain, Alberta, Canada. While Growth Mindset 1.0 got little traction in Canada, the second-generation iteration dreamed up by Couros is increasingly popular among technology-savvy Canadian and American educators.

Legions of professional educators and teachers in the United States, Britain, and Australia have latched onto GM theory and practice with a vengeance. One reliable barometer of “trendiness,” the George Lucas Educational Foundation website Edutopia, provides a steady stream of short vignettes and online videos extolling the virtues of GM in the classroom. The growing list of Growth Mindset pieces on Edutopia purports to “support students in believing that they can develop their talents and abilities through hard work, good strategies, and help from others.”

What is the original conception of the Growth Mindset?  Here is how Carol Dweck explained it succinctly in the September 22, 2015 issue of Education Week: “We found that students’ mindsets—how they perceive their abilities—played a key role in their motivation and achievement, and we found that if we changed students’ mindsets, we could boost their achievement. More precisely, students who believed their intelligence could be developed (a growth mindset) outperformed those who believed their intelligence was fixed (a fixed mindset). And when students learned through a structured program that they could “grow their brains” and increase their intellectual abilities, they did better. Finally, we found that having children focus on the process that leads to learning (like hard work or trying new strategies) could foster a growth mindset and its benefits.”

Dweck’s theory of Growth Mindsets gained credibility because, unlike most educational ‘fads,’ it did emerge out of some sound initial research into brain plasticity and was tested in case studies with students in the schools. Leading education researcher Dylan Wiliam, a renowned student assessment expert, lent his support to the Growth Mindset movement when he embraced Dweck’s findings and applied them to building ‘feedback’ into student assessment. He adopted this equation: Talent = Hard Work + Persistence (A Growth Mindset), and offered this endorsement: “The harder you work, the smarter you get. Once students begin to understand this ‘growth mindset’ as Carol Dweck calls it, students are much more likely to embrace feedback from their teachers.”

Ten years on, cracks appeared in the Growth Mindset movement when some of the liveliest minds in education research began to probe more deeply into the theory, the follow-up studies, and the supposed evidence of student success. An early skeptic, Disappointed Idealist, hit a nerve with a brave little commentary on December 5, 2014, wondering whether the Growth Mindset described the world as we wanted it to be rather than as it is, and likening it to “telling penguins to flap harder (and they would be able to fly like other birds).” Self-styled ‘education progressives’ have taken their cue from American writer Alfie Kohn, who weighed in with a widely read Salon commentary arguing that Dweck’s research had been appropriated by “conservative” educators trying to “fix our kids” when we should be “fixing the system.”

The Growth Mindset ‘magic dust’ is wearing thin in the United Kingdom. British education gadfly David Didau, The Learning Spy, initially “pretty psyched” by Dweck’s theory, has grown increasingly skeptical over the past year or so. In a succession of pointed commentaries, he has punched holes in the assumption that all students possess unlimited “growth potential,” examined why more recent GM interventions have not replicated Dweck’s initial results, questioned whether GM is founded on pseudoscience, and even suggested that the whole theory might be “bollocks.”

Intrepid Belgian education researcher Pedro De Bruyckere, co-author of Urban Myths About Learning and Education, has registered his concerns about the validity of the supporting research, citing University of Edinburgh psychologist Timothy Bates’ findings. Based upon case studies with 12-year-olds in China, Bates found no evidence of the dramatic changes reported in Dweck’s earlier studies: “People with a growth mindset don’t cope any better with failure. If we give them the mindset intervention, it doesn’t make them behave better. Kids with the growth mindset aren’t getting better grades, either before or after our intervention study.”

For much of the past two years, Dweck and her research associate Susan Mackie have been alerting researchers and education policy-makers to the spread of what they term a “false growth mindset” in schools and classrooms in Australia as well as Britain and the United States. Too many teachers and parents, they point out, have either misinterpreted or debased the whole concept, reducing it to simple axioms like “Praise the effort, not the child (or the outcome).” In most cases, it’s educational progressives, or parents, looking for alternatives to “drilling with standardized tests.”

Dweck’s greatest fear nowadays is that Growth Mindset has been appropriated by education professionals to reinforce existing student-centred practices and to suit their own purposes. That serious concern is worth repeating: “It’s the fear that the mindset concepts, which grew up to counter the failed self-esteem movement, will be used to perpetuate that movement.” In a December 2016 interview story in The Atlantic, she conceded that it was being used in precisely that way in too many classrooms, and that it amounted to “blanketing everyone with praise, whether deserved or not.”

A “false growth mindset” arises, according to Dweck, when educators use the term too liberally and do not really understand that it is intended to motivate students to work harder and demonstrate more resilience in overcoming setbacks. She puts it this way: “The growth mindset was intended to help close achievement gaps, not hide them. It is about telling the truth about a student’s current achievement and then, together, doing something about it, helping him or her become smarter.” Far too many growth mindset disciples, Dweck now recognizes, reverted to praising students rather than guiding them through “the long and difficult journey” of the learning process and showing them “how hard work, good strategies, and good use of resources lead to better learning.”

One of Dweck’s most prominent champions, Jo Boaler, may be contributing to the misappropriation of Growth Mindset theory in her own field. An influential Stanford University mathematics education professor, Boaler is best known as an apostle of constructivist approaches to teaching Mathematics in schools. She saw in Dweck’s Growth Mindset theory confirmation that a “fixed mindset” was harmful to kids convinced that they “can’t do Math.” It all fit nicely into her own conception of how children learn Math best: by exploration and discovery in classrooms that unleash children’s potential. It became, for Boaler, a means of addressing “inequalities” perpetuated by “ability groupings” in schools. It also served to advance her efforts to “significantly reposition mistakes in mathematics” and to replace “crosses” with “gold stars” and whole-class “opportunities for learning.”

The Canadian mutation, George Couros’ The Innovator’s Mindset, seeks to extend Carol Dweck’s original theory into the realm of technology and creativity. Troubled by the limitations of Dweck’s model and its emphasis on mastery of knowledge and skills, he made an “awesome” (his word) discovery: that GM could be a powerful leadership tool for advancing “continuous creation.” In his mutation of the theory, the binary “fixed” vs. “growth” model morphs into a more advanced stage, termed the “innovator’s mindset.” In his fertile and creative mind, it is transmogrified into a completely new theory of teaching and learning.

Taking poetic licence with Dweck’s research-based thesis, Couros spins a completely different interpretation on his fascinating professional blog, The Principal of Change:

“As we look at how we see and ‘do’ school, it is important to continuously shift to moving from consumption to creation, engagement to empowerment, and observation to application. It is not that the first replaces the latter, but that we are not settling for the former. A mindset that is simply open to ‘growth’ will not be enough in a world that is asking for continuous creation of not only products, but ideas.”

Promising educational theories, even those founded on robust initial research, can fall prey to prominent educators pushing their own ‘pet ideas’ and pedagogical theories. While a 2016 Education Week report demonstrates that GM initiatives produce mixed results, and British education researchers are having a field day picking apart Carol Dweck’s research findings, another version of her creation is emerging, making it even harder to assess whether her original case studies can be replicated around the world.

Which version of Carol Dweck’s Growth Mindset theory and practice are we assessing – the original conception or the “false” conception?  How and why did an educational theory intended to motivate students, instill a work ethic, and help kids overcome obstacles get so debased in translation into classroom practice?  Is the fate of the Growth Mindset indicative of something more troubling in the world of education research? 

With the release of the 2015 Program for International Student Assessment (PISA) results on the horizon, the Organization for Economic Cooperation and Development (OECD) Education Office has stoked up the “Math Wars” with a new study. While the October 2016 report examines a number of key questions related to teaching Mathematics, OECD Education chose to highlight its findings on “memorization,” presumably to dispel perceptions about “classroom drill” and its use in various countries.

The OECD, which administers the PISA assessments every three years to 15-year-olds around the globe, periodically publishes reports looking at slices of the data. Its October 2016 report, Ten Questions for Mathematics Teachers and How PISA Can Help Answer Them, based upon the most recent 2012 results, zeroes in on “memorization” and attempts to show that high-performing territories, like Shanghai-China, Korea, and Chinese-Taipei, rely less on memory work than lower-performing places like Ireland, the UK, and Australia.

Stanford Mathematics educator Jo Boaler, renowned for “Creative Math,” jumped upon the PISA study to buttress her case against “memorization” in elementary classrooms. In a highly contentious November 2016 Scientific American article, Boaler and co-author Pablo Zoido contended that the PISA findings confirmed that “memorizers turned out to be the lowest achievers, and countries with high numbers of them—the U.S. was in the top third—also had the highest proportion of teens doing poorly on the PISA math assessment.” Students who relied on memorization, they further argued, were “approximately half a year behind students who used relational and self-monitoring strategies” such as those in Japan and France.

Australian education researcher Greg Ashman took a closer look at the PISA study and called into question such hasty interpretations of the findings. Figure 1.2, “How teachers teach and students learn,” caught his eye, and he went to work interrogating the survey responses on “memorization” and the axes used to present the data. The PISA analysis, he discovered, also did not include any assessment of how teaching methods might be correlated with PISA scores in Mathematics. Manitoba Mathematics professor Robert Craigen spotted a giant hole in the PISA analysis, noting that the “memorization” data related to the “at-home strategies of students,” not their instructional experiences, and may well indicate that students who are improperly instructed in class resort to memorization on their own.

What would it look like, Ashman wondered, if the PISA report had plotted how students performed in relation to the preferred methods used, on the continuum from “more student-oriented instruction” to “more teacher-directed instruction”? Breaking down all the data, he generated a new graph showing how teaching method correlated with math performance, and found a “positive correlation” between teacher-directed instruction and higher Math scores. “Correlations,” he duly noted, “do not necessarily imply causal relationships,” but a higher ratio of teacher-directed activity to student orientation was clearly associated with better results.
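Ashman’s re-analysis boils down to a simple statistical exercise: compute a correlation coefficient between each country’s ratio of teacher-directed to student-oriented instruction and its mean PISA math score. A minimal sketch of that kind of calculation, using invented illustrative numbers rather than the actual PISA data:

```python
# Illustrative sketch only: the figures below are invented, NOT actual PISA data.
# It shows the kind of calculation behind a "positive correlation" claim:
# Pearson's r between an instruction ratio and a mean math score.

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var_x = sum((x - mean_x) ** 2 for x in xs)
    var_y = sum((y - mean_y) ** 2 for y in ys)
    return cov / (var_x ** 0.5 * var_y ** 0.5)

# Hypothetical per-country values: ratio of teacher-directed to
# student-oriented instruction, and mean math score.
ratio = [0.9, 1.1, 1.3, 1.5, 1.8]
score = [480, 495, 505, 520, 540]

r = pearson_r(ratio, score)
print(round(r, 3))  # near +1.0 for this contrived, nearly linear data
```

Pearson’s r runs from -1 to +1; a value near +1, as in this contrived example, is what a “positive correlation” claim refers to, and, as Ashman cautions, it says nothing by itself about causation.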

Jumping on the latest research to seek justification for her own ‘meta-beliefs’ is normal practice for Boaler and her “Discovery Math” education disciples. After once again junking the ‘strawmen’ of traditional Mathematics, “rote memorization” and “drill,” Boaler and Zoido wax philosophical and poetic: “If American classrooms begin to present the subject as one of open, visual, creative inquiry, accompanied by growth-mindset messages, more students will engage with math’s real beauty. PISA scores would rise, and, more important, our society could better tap the unlimited mathematical potential of our children.” That’s stretching the evidence far beyond the breaking point.

The “Math Wars” do generate what University of Virginia psychologist Daniel T. Willingham has aptly described as “a fair amount of caricature.” The recent Boaler-Zoido Scientific American article is a prime example of that tendency. Most serious scholars of cognition support the common-ground position that learning mathematics requires three distinct types of knowledge: factual, procedural, and conceptual. “Factual knowledge,” Willingham points out, “includes having already in memory the answers to a small set of problems of addition, subtraction, multiplication, and division.” While some students can learn Mathematics through invented strategies, such strategies cannot be relied upon for all children. On the other hand, knowledge of procedures is no guarantee of conceptual understanding, particularly when it comes to complexities such as dividing fractions. It’s clear to most sensible observers that knowing math facts, procedures, and concepts is what counts when it comes to mastering mathematics.

Simply ignoring research that contradicts one’s ‘meta-beliefs’ is also common on the Math Education battlefield. Recent academic research on “memorization” that contradicts Boaler and her entourage has been passed over in silence, even when it emanates from her own university. Two years ago, Shaozheng Qin and Vinod Menon of Stanford University Medical School led a team that provided scientifically validated evidence that “rote memorization” plays a critical role in building the capacity to solve complex calculations.

Based upon a clinical study of 68 children, aged 7 to 9, followed over the course of one year, Qin, Menon, and their colleagues found in their 2014 Nature Neuroscience study that memorizing the answers to simple math problems, such as basic addition or multiplication, forms a key step in a child’s cognitive development, helping bridge the gap between counting on fingers and tackling more complex calculations. Memorizing the basics, they concluded, is the gateway to activating the hippocampus, a key brain structure for memory, which gradually expands its role in “overlapping waves” to accommodate the greater demands of more complex math.

The whole debate over memorization is suspect because of the imprecision in the use of the term. Practice, drilling, and memorization are not the same, even though they get conflated in Jo Boaler’s work and in much of the current Mathematics Education literature. Back in July 2012, D.T. Willingham made this crucial point and provided some valuable distinctions. “Practice,” as defined by Anders Ericsson, involves performing tasks and receiving feedback on that performance, executed for the purpose of improvement. “Drilling” connotes repetition for the purpose of achieving automaticity, which, at its worst, amounts to mindless repetition or parroting. “Memorization,” on the other hand, relates to the goal of getting something into long-term memory with ready access, but does not imply using any particular method to achieve that goal.

Memorization has become a dirty word in teaching and learning, laden with so much baggage that it conjures up mental pictures of “drill and kill” in the classroom. The 2016 PISA study appears to perpetuate such stereotyping and, worst of all, completely misses the “positive correlation” between teacher-directed or explicit instruction and better performance in mathematics.

Why does the PISA Study tend to associate memorization in home-study settings with the drudgery of drill in the classroom?  To what extent does the PISA Study on Mathematics Teaching support the claims made by Jo Boaler and her ‘Discovery Math’ advocates? When it comes to assessing the most effective teaching methods, why did the PISA researchers essentially take a pass? 

