Posts Tagged ‘Explicit Instruction’

“Wow!” “Fantastic!” and “Inspirational” were the words that filled the Twitter feed coming out of the latest Halifax Regional Centre for Education (HRCE) Innovation in Teaching Day (#HRCEID2018), held November 2 and 3, 2018.  The primary cause of the frenzied excitement was a keynote talk by Brian Aspinall, author of the edtech best-seller Code Breaker, a teacher’s guide to training up a class of “coder ninjas.”  The former Ontario Grade 7 and 8 teacher from Essex County honed his presentation skills at TEDx Talks in Chatham and Kitchener and is now the hottest speaker on the Canadian edtech professional development circuit.

Mr. Aspinall, the #CodeBreaker, is a very passionate, motivational speaker with a truly amazing social media following. He built his first website in the 1990s before graduating from Harrow District High School, earned his B.Sc. and B.Ed. at the University of Windsor, and learned the teaching craft in the local Windsor Essex school system. In 2016, he won a Prime Minister’s Award for Teaching Excellence in STEM. Watching him in action on YouTube, it’s obvious that he’s a real showman and fairly typical of a new breed of North American edtech evangelists.

Like many edtech visionaries, Aspinall experienced an epiphany, in his case while teaching his Grade 8 class. “Someone brought to my attention that every grade 8 in our building was born in 2000 or 2001,” he recalls. “You could hear the brain matter shift, turn, implode and explode in my head. I had never thought of it like that. My mind was blown.”  Then Aspinall remembered: “I have only taught in the 21st century…went to university in the 21st century!  And I’ve been teaching for nine years now!!”

Edtech evangelists like Aspinall have multiplied rapidly in the 2000s as provincial and school district authorities have pursued a succession of “21st century skills” initiatives. The leading motivational speakers, closely aligned with Google, Microsoft, or Pearson PLC, develop their own personalized brands and can be very persuasive in engaging users without any overt marketing. The first and perhaps best-known 21st century skills evangelist was Guy Kawasaki, the marketing guru who helped launch the Apple Macintosh in 1984 and popularized the use of the word “evangelist” to describe this marketing approach. The TED Talks backlist is not only edtech-dominated, but a ‘who’s who’ of edtech evangelism.

Aspinall is an open book and connected almost 24/7, judging from his personal MrAspinall.com blog and rapid-fire Twitter feed. With 60,400 tweets to his credit, @mraspinall has amassed 40,900 followers and recorded 43,100 likes. Reading his tweets, it’s abundantly clear that he’s an unabashed educational constructivist who firmly believes in student-centred, minimal-guidance, discovery learning.

Speaking on stage, Aspinall has a messianic, 21st century cool presentation style. “I’m on a mission to expose as many kids as possible to coding and computer science,” he declared in June 2016 at TEDx KitchenerED.  That’s popular in provinces like Nova Scotia and British Columbia where coding is being implemented in elementary schools — and where teachers are hungry for classroom-ready activities. He’s filling a need, particularly among teachers in the early grades with little or no background or training in mathematics, science or computer science.

What’s contentious about the edtech evangelists is their rather uncritical acceptance of constructivist pedagogy and utopian belief that “students learn by doing” and require minimal teacher guidance.  A few, like Brian Aspinall, are ideologues who believe that “knowledge is readily available” on the Internet, so teachers should reject teaching content knowledge and, instead, “teach and model an inquiry approach to learning.”

Aspinall’s educational philosophy deserves more careful scrutiny.  In his teaching guide and TEDx Talks, he embraces a distinctly “21st century learning” paradigm. In his 2016 TEDx talk “Hacking the classroom,” he distills his philosophy down to four “hacks” or principles: 1) focus on content creation; 2) embrace failure so kids take risks; 3) free up time and avoid time-limited tests/assignments; and 4) embrace the “process of learning” rather than the pursuit of knowledge-based outcomes.

Those principles may sound familiar because they are among the first principles not only of constructivist thinking on education, but of the corporate movement driving “21st century learning” and its latest mutation, “personalized learning” enabled by computer software and information technology.  In the case of Aspinall, he’s clearly an educational disciple of the late Seymour Papert, the MIT professor who co-developed the Logo programming language and championed ‘discovery learning’ in mathematics and science.  If Aspinall has a catechism, it is to be found in Papert’s 1980 classic, Mindstorms: Children, Computers and Powerful Ideas.

Aspinall has also latched onto the writings of Jeannette M. Wing, former chair of Computer Science at Carnegie Mellon University in Pittsburgh, PA. One of his favourite axioms, quoted regularly, is extracted from Wing: “Computational thinking is a fundamental skill for everyone, not just for computer scientists.” She goes further: “To reading, writing and arithmetic, we should add computational thinking to every child’s analytical ability.” Indeed, like Wing, Aspinall sees “coding” as a way of teaching mathematics in a more holistic curriculum.

EdTech evangelists such as Aspinall stir interest in learning coding, but fall into the trap of assuming that constructivism works in every class, irrespective of class composition, size, or capabilities. Utopian conceptions of teaching and learning bequeathed by Papert are now being seriously challenged by evidence-based research. Classroom conditions and student management concerns conspire to limit the applicability of “makerspace learning” and teachers rarely have the resources to make it work in practice.

More fundamentally, Papert’s model of “minimal guidance” has been effectively challenged by Paul A. Kirschner, John Sweller and Richard E. Clark (2006). “Prior knowledge,” they found, is essential in providing the “internal guidance” required in truly learning something. High-quality, engaging and explicit instruction is necessary, in most instances, to ‘bootstrap’ learning. While personal exploration is useful, the most effective teaching and learning approach combines teacher guidance with exploration woven into a child’s education.

Teachers dazzled by Aspinall’s presentations are most likely immersed in edtech culture. Computer software apps and tools such as “Makey Makey” and “Scratch” are bound to make teaching easier for educators and more pleasurable for students. Few question Aspinall’s promotion of Tynker coding programs or his corporate affiliation as a “Microsoft Innovative Educator Expert Fellow.” In his TEDx Talks, he is quite open about his admiration for Microsoft’s philosophy. “Microsoft believes every child should be exposed to coding,” he tells audiences. “Because you don’t know you like broccoli until you try it.” While he’s not peddling 21st century ‘snake oil,’ such statements do raise suspicions.

Why have edtech evangelists come to dominate the ’21st century skills’ professional development circuit?  What explains the popularity of, and excitement generated by, TED Talk edtech speakers such as Brian Aspinall? Is coding emerging as the “4th R” of 21st century learning and what’s its impact upon the teaching of mathematics in the early grades? Should we be more leery of champions of coding who see it as a way of introducing “computational thinking” throughout the elementary years? 

Read Full Post »

High school English language arts teacher Merion Taynton took “a leap of faith” in November 2016 and jumped “with both feet” into Project-Based Learning (PBL).

While teaching Goethe’s Faust to her Grade 10 class in a Chinese independent school, she adopted PBL in an attempt to “grapple with the ideas” within the text rather than “the text itself.” What would you sell your soul for? How much are your dreams worth? Those were the questions Ms. Taynton posed as she set aside her regular teaching notes on 19th Century European Literature. Students would complete their own projects and decide, on their own, how to present their findings. “I’m going to do a video,” one said. “I’m going to produce a rap song,” chimed in another, and the whole approach was ‘anything goes’ as long as the students could produce a justification.

Ms. Taynton’s project-based learning experience was not just a random example of the methodology, but rather an exemplar featured on the classroom trends website Edutopia under the heading “Getting Started with Literature and Project-Based Learning.”  Better than anything else, this learning activity demonstrates both the risks and the pitfalls of jumping on educational fads in teaching and learning.

After spreading like pedagogical magic dust over the past five years, Project-Based Learning recently hit a rough patch. Fresh educational research generated in two separate studies, one conducted by Durham University for the Education Endowment Foundation (EEF) in the United Kingdom and the other a component of the OECD’s Programme for International Student Assessment (PISA), has raised serious questions about the effectiveness of PBL and other minimally teacher-guided pedagogical strategies.

The EEF study of Project-Based Learning (November 2016) examined “Learning through REAL projects” involving some 4,000 Year 7 pupils in 24 schools from 2013-14 to April 2016, utilizing a randomized controlled trial. The research team found “no clear impact on either literacy … or student engagement with school and learning.” More telling was the finding that the effect on the literacy of children eligible for Free School Meals (FSM) – a measure of disadvantage – was “negative and significant.” Simply put, switching to PBL from traditional literacy instruction was harmful to the most needy students of all.

The 2015 PISA results, released December 6, 2016, delivered another blow to minimally teacher-guided methods, such as PBL and its twin sister, inquiry-based learning.  When it came to achievement in science among 15-year-olds, the finding was that such minimally guided instruction lagged far behind explicit instruction in determining student success. In short, the more inquiry-based learning students reported being exposed to, the lower their science scores tended to be.

Much of the accumulating evidence tends to support the critical findings of Paul A. Kirschner, John Sweller and Richard E. Clark in their authoritative 2006 article in Educational Psychologist.  “Minimally-guided instruction,” they concluded, based upon fifty years of studies, “is less effective and less efficient than instructional methods that place a strong emphasis on guidance of the student learning process.” The superiority of teacher-guided instruction, they claimed, can be explained utilizing evidence from studies of “human cognitive architecture, expert-novice differences, and cognitive load.”

Project-Based Learning, like inquiry-based approaches, may have some transitory impact on student engagement in the classroom. Beyond that, however, it’s hard to find  much actual evidence to support its effectiveness in mastering content knowledge, applying thinking skills, or achieving higher scores, particularly in mathematics and science.

In September 2015, an Ontario Education What Works? Research into Practice monograph, authored by David Hutchison of Brock University, provided a rather mixed assessment of PBL. While the author claimed that PBL had much to offer as a “holistic strategy” promoting “student engagement” and instilling “21st century skills,” it faced “challenges that can limit its effectiveness.” Where the strategy tends to fall short is in mastery of subject content and classroom management, where the time, scope and quality of the activities surface as ongoing challenges.

Implementation of PBL on a system-wide basis has rarely been attempted, and, in the case of Quebec’s education reform initiative, Schools on Course, from 1996 to 2006, it proved to be an unmitigated disaster, especially for secondary school teachers and students. The “project method” adopted in the Quebec Education Program (QEP), imposed top-down, ran into fierce resistance from both teachers and parents in English-speaking Quebec, who openly opposed the new curriculum, claiming that it taught “thinking skills without subject content.” In a province with a tradition of provincial exit examinations, PBL cut against the grain and faltered when student scores slipped in 2006 in both Grade 6 provincial mathematics tests and the global Trends in International Mathematics and Science Study (TIMSS) assessments.

None of the critical research findings or claims of ineffectiveness have blunted the passion or commitment of PBL advocates across North America.  With the support of the ASCD’s Educational Leadership magazine and web platforms such as Edutopia, a handful of PBL curriculum and program experts, including Jane Krauss of the International Society for Technology in Education (ISTE), Suzie Boss of Stanford’s Center for Social Innovation, and Dr. Sylvia Chard of the University of Alberta, have been effective in planting it in hundreds of school systems from Oregon and California to New York State and Ontario, and in New England and the Maritime provinces.

The PBL movement in North America is propelled by progressive educational principles and an undeniable passion for engaging students in learning.  Powered by 21st century learning precepts and championed by ICT promoters, it rests upon some mighty shaky philosophical foundations and is supported by precious little research evidence. Lead promoter Suzie Boss is typical of those advocates. “Projects make the world go round,” she wrote in a 2011 Edutopia blog post, adding that “Confucius and Aristotle were early proponents of learning by doing.” That may be quite imaginative, but it is also completely fallacious.

Most of the PBL “research” is actually generated by one California organization, the Buck Institute for Education, where the lead promoters and consultants are schooled in its core principles and where PBL facilitators develop teaching units and workshops. It’s actively promoted by ISTE, Edutopia, and a host of 21st century skills advocates.

Even Canadian faculty of education supporters like Hutchison concede that implementing PBL is “time-intensive” and fraught with classroom challenges. Among those “challenges” are formidable obstacles such as a) managing the significant time commitment; b) ensuring that projects have sufficient subject depth; c) balancing student autonomy with the imperative of some teacher direction; and d) keeping projects on track using ongoing (formative) assessment instruments. When it comes to implementing PBL in ESL/ELL classrooms or with larger groups of special needs students, those challenges are often insurmountable.

What works best as a core instructional approach – explicit instruction or minimal teacher guided approaches, such as PBL and Inquiry-Based Learning?  Which approach is best equipped to raise student achievement levels, particularly in mathematics and science?  Are the potential benefits in terms of promoting student engagement and instilling collaborative skills enough to justify its extensive use in elementary schools? 

Read Full Post »

“Canadians can be proud of our showing in the 2015 Programme for International Student Assessment (PISA) report,” declared science consultant Bonnie Schmidt and former Council of Ministers of Education, Canada (CMEC) director Andrew Parkin in their first-off-the-mark December 6, 2016 response to the results. “We are,” they added, “one of only a handful of countries that places in the top tier of the Organisation for Economic Co-operation and Development (OECD) in each of the three subjects tested: science, reading and math.”

“Canada” and “Canadian students,” we were told, were once again riding high in the once-every-three-years international test sweepstakes. If that effusively positive response had a familiar ring, it was because it followed the official line advanced by a markedly similar CMEC media release, issued a few hours before the commentary.

Since our students, all students in each of our ten provincial school systems, were “excelling,” it was time for a little national back-slapping. There’s one problem with that blanket analysis: it serves to maintain the status quo, engender complacency, obscure the critical Mathematics scores, and disguise the lopsided nature of student performance from region to region.

Hold on, not so fast, CMEC — the devil is in the details, which are more clearly portrayed in the OECD’s own “Country Profile” for Canada. Yes, 15-year-olds in three Canadian provinces (Alberta, British Columbia, and Quebec) achieved some excellent results, but overall Mathematics scores were down, and students in over half of our provinces trailed off into mediocrity in terms of performance. Our real success was not in performance, but rather in reducing the achievement gap adversely affecting disadvantaged students.

Over half a million 15-year-olds in some 72 jurisdictions around the world completed the PISA tests, and Schmidt and Parkin were not alone in making sweeping pronouncements about why Canada and some other countries are up and others down in the global rankings.

Talking in aggregate terms about the PISA performance of 20,000 Canadian students in ten different provinces can be, and is, misleading when the performance results in mathematics continue to lag, Ontario students continue to underperform, and students in two provinces, Manitoba and Saskatchewan, struggle in science, reading, and mathematics.  Explaining all that away is what breeds complacency in the school system.

My own PISA 2015 forecast was way off-base — and taught me a lesson.  After the TIMSS 2015 Mathematics results released in November 2016, an East Asian sweep, led by Singapore and Korea, seemed like a safe bet. How Finland performs also attracts far less attention than it did in its halcyon days back in 2003 and 2006. The significant OECD pivot away from excellence toward equity caught me napping, and I completely missed the significance of the move (from 2012 to 2015) from pencil-and-paper to computer-based tests.

Some solace can be found in the erroneous forecasts of others. The recent Alberta Teachers’ Association (ATA) “Brace Yourself” memo, with its critique of standardized testing, seemed to forecast a calamitous drop in Alberta student performance levels. That drop materialized only in Mathematics.

Advocates of the ‘Well-Being’ curriculum and broader assessment measures, championed by Toronto’s People for Education, will likely be temporarily thrown off-stride by the OECD’s new-found commitment to assessing equity in education. It will now be harder to paint PISA as evil and to discredit PISA results as resting upon such a narrow range of skills in reading, math and science.

The OECD’s “Country Profile” of Canada is worth studying carefully because it aggregates data from 2003 to 2015, clarifies the trends, and shows how Canadian students continue to struggle in mathematics far more than in reading and science.

Canadian students may have finished 12th in Mathematics with a 516 aggregate score, but the trend line continues to be in decline, down from 532 in 2003. Digging deeper, we see that students in only two provinces, Quebec (544) and BC (522), actually exceeded the national mean score. Canada’s former leader in Mathematics performance, Alberta, continued its downward spiral from the lofty heights of 549 (2003) to 511 (2015).

Since Ontario students’ provincial mathematics scores are declining, experts will be poring over the latest PISA results to see how bad it is in relation to the world’s top-performing systems. No surprises here: Ontario students scored 509, finishing 4th in Canada, down from 530 on PISA 2003. Restoring excellence will require a significant change in direction.

The biggest discovery in post-2015 PISA analysis was the positive link between explicit instruction and higher achievement in the 2015 core assessment subject, science. The most important factor linked with high performance remains SES (socio-economic status), but teacher-guided instruction weighed in close behind, and students taught with minimal direction, in inquiry or project-based classes, simply performed less well on the global test.

The results of the 15-year-olds are largely determined over 10 years of schooling, and are not necessarily the direct consequence of the latest curriculum fad such as “discovery math.”

It’s better to look deeper into what this cohort of students was learning when first entering the school system a decade or so earlier. In the case of Canadian students, for example, student-centred learning was at its height, and the country was just awakening to the value of testing to determine what students were actually learning in class.

Where the student results are outstanding, as in Singapore and Estonia, the success is not solely attributable to the excellence of teaching or the rigour of the math and science curriculum.

We know from the “tutoring explosion” in Canada’s major cities that the prevalence of private tuition classes after school is a contributing factor, and may explain the current advantage still enjoyed in mathematics by Pacific Rim students.

Children of Chinese heritage in Australia actually outperformed students in Shanghai on the 2012 PISA test, and we need to explore whether that may be true for their counterparts in Greater Vancouver. The so-called “Shanghai Effect” may be attributed as much to “tiger mothers” as it is to the quality of classroom instruction.

Whether Canada and Canadians continue to exhibit high PISA self-esteem or have simply plateaued does not matter as much as what we glean over the next few years from studying best international practice in teaching, learning, and assessment.

Surveying PISA student results, this much is clear: standing still is not an option in view of the profound changes that are taking place in life, work, and society.

 

Read Full Post »

With the release of the 2015 Programme for International Student Assessment (PISA) results on the horizon, the Organisation for Economic Co-operation and Development (OECD) Education Office has stoked up the “Math Wars” with a new study. While the October 2016 report examines a number of key questions related to teaching Mathematics, OECD Education chose to highlight its findings on “memorization,” presumably to dispel perceptions about “classroom drill” and its use in various countries.

The OECD, which administers the PISA assessments every three years to 15-year-olds around the globe, periodically publishes reports looking at slices of the data. Its October 2016 report, Ten Questions for Mathematics Teachers and How PISA Can Help Answer Them, based upon the most recent 2012 results, tends to zero in on “memorization” and attempts to show that high-performing territories, like Shanghai-China, Korea, and Chinese Taipei, rely less on memory work than lower-performing places like Ireland, the UK, and Australia.

Stanford Mathematics educator Jo Boaler, renowned for “Creative Math,” jumped upon the PISA study to buttress her case against “memorization” in elementary classrooms. In a highly contentious November 2016 Scientific American article, Boaler and co-author Pablo Zoido contended that PISA findings confirmed that “memorizers turned out to be the lowest achievers, and countries with high numbers of them—the U.S. was in the top third—also had the highest proportion of teens doing poorly on the PISA math assessment.” Students who relied on memorization, they further argued, were “approximately half a year behind students who used relational and self-monitoring strategies” such as those in Japan and France.

Australian education researcher Greg Ashman took a closer look at the PISA study and called into question such hasty interpretations of the findings.  Figure 1.2, “How teachers teach and students learn,” caught his eye, and he went to work interrogating the survey responses on “memorization” and the axes used to present the data.  The PISA analysis, he discovered, also did not include an assessment of how teaching methods might be correlated with PISA scores in Mathematics.  Manitoba Mathematics professor Robert Craigen spotted a giant hole in the PISA analysis and noted that the “memorization” data related to “at-home strategies of students,” not their instructional experiences, and may well indicate that students who are improperly instructed in class resort to memorization on their own.

What would it look like, Ashman wondered, if the PISA report had plotted how students performed in relation to the preferred teaching methods, on a continuum from “more student-oriented instruction” to “more teacher-directed instruction”? Breaking down the data, he generated a new graph that showed how teaching method correlated with math performance and found a “positive correlation” between teacher-directed instruction and higher Math scores. “Correlations,” he duly noted, “do not necessarily imply causal relationships,” but the pattern clearly favoured a higher ratio of teacher-directed activity to student orientation.
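For readers who want to try a comparable exercise, here is a minimal sketch of the kind of calculation such a re-plot implies: correlating a jurisdiction-level index of teacher-directed instruction with mean mathematics scores. The CSV file name and column names are hypothetical placeholders, and the pandas/SciPy tooling is an assumption of mine, not Ashman’s or the OECD’s actual workflow.

```python
# Minimal sketch, not Ashman's or the OECD's actual code. Assumes a hypothetical
# CSV, "pisa2012_by_jurisdiction.csv", with one row per jurisdiction and two
# columns: "teacher_directed_index" (survey-derived) and "math_mean_score".
import matplotlib.pyplot as plt
import pandas as pd
from scipy.stats import pearsonr

df = pd.read_csv("pisa2012_by_jurisdiction.csv")

# Pearson correlation between the teaching-style index and mean math score.
r, p = pearsonr(df["teacher_directed_index"], df["math_mean_score"])
print(f"r = {r:.2f}, p = {p:.3f}")

# A quick scatter plot comparable to the re-plotted graph described above.
ax = df.plot.scatter(x="teacher_directed_index", y="math_mean_score")
ax.set_xlabel("Teacher-directed instruction index (survey-based)")
ax.set_ylabel("Mean PISA 2012 mathematics score")
plt.show()

# A positive r is an association only; as Ashman cautions, correlation does not
# establish that teacher-directed instruction causes higher scores.
```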

Jumping on the latest research to seek justification for her own “meta-beliefs” is normal practice for Boaler and her “Discovery Math” education disciples. After junking, once again, the ‘strawmen’ of traditional Mathematics (“rote memorization” and “drill”), Boaler and Zoido wax philosophical and poetic: “If American classrooms begin to present the subject as one of open, visual, creative inquiry, accompanied by growth-mindset messages, more students will engage with math’s real beauty. PISA scores would rise, and, more important, our society could better tap the unlimited mathematical potential of our children.” That’s definitely stretching the evidence far beyond the breaking point.

The “Math Wars” do generate what University of Virginia psychologist Daniel T. Willingham has aptly described as “a fair amount of caricature.” The recent Boaler-Zoido Scientific American article is a prime example of that tendency. Most serious scholars of cognition tend to support the common-ground position that learning mathematics requires three distinct types of knowledge: factual, procedural and conceptual. “Factual knowledge,” Willingham points out, “includes having already in memory the answers to a small set of problems of addition, subtraction, multiplication, and division.” While some students can learn Mathematics through invented strategies, such strategies cannot be relied upon for all children. On the other hand, knowledge of procedures is no guarantee of conceptual understanding, particularly when it comes to complexities such as dividing fractions. It’s clear to most sensible observers that knowing math facts, procedures and concepts is what counts when it comes to mastering mathematics.

Simply ignoring research that contradicts your ‘meta-beliefs’ is common on the Math Education battlefield. Recent academic research on “memorization” that contradicts Boaler and her entourage is simply ignored, even research emanating from her own university. Two years ago, Shaozheng Qin and Vinod Menon of the Stanford University School of Medicine led a team that provided scientifically validated evidence that “rote memorization” plays a critical role in building the capacity to solve complex calculations.

Based upon a clinical study of 68 children, aged 7 to 9, followed over the course of one year, Qin, Menon and their colleagues found, in their 2014 Nature Neuroscience study, that memorizing the answers to simple math problems, such as basic addition or multiplication, forms a key step in a child’s cognitive development, helping bridge the gap between counting on fingers and tackling more complex calculations. Memorizing the basics, they concluded, is the gateway to activating the hippocampus, a key brain structure for memory, which gradually expands in “overlapping waves” to accommodate the greater demands of more complex math.

The whole debate over memorization is suspect because of the imprecision in the use of the term. Practice, drilling, and memorization are not the same, even though they get conflated in Jo Boaler’s work and in much of the current Mathematics Education literature. Back in July 2012, D.T. Willingham made this crucial point and provided some valuable distinctions. “Practice,” as defined by Anders Ericsson, involves performing tasks and receiving feedback on that performance, executed for the purpose of improvement. “Drilling” connotes repetition for the purpose of achieving automaticity, which, at its worst, amounts to mindless repetition or parroting. “Memorization,” on the other hand, relates to the goal of something ending up in long-term memory with ready access, but does not imply using any particular method to achieve that goal.

Memorization has become a dirty word in teaching and learning, laden with so much baggage that it conjures up mental pictures of “drill and kill” in the classroom. The 2016 PISA study appears to perpetuate such stereotyping and, worst of all, completely misses the “positive correlation” between teacher-directed or explicit instruction and better performance in mathematics.

Why does the PISA Study tend to associate memorization in home-study settings with the drudgery of drill in the classroom?  To what extent does the PISA Study on Mathematics Teaching support the claims made by Jo Boaler and her ‘Discovery Math’ advocates? When it comes to assessing the most effective teaching methods, why did the PISA researchers essentially take a pass? 

 

Read Full Post »