
Archive for the ‘Student Assessment’ Category

With the release of the 2015 Program for International Student Assessment (PISA) on the horizon, the Organization for Economic Cooperation and Development (OECD) Education Office has stoked up the “Math Wars” with a new study. While the October 2016 report examines a number of key questions related to teaching Mathematics, OECD Education chose to highlight its findings on “memorization,” presumably to dispel perceptions about “classroom drill” and its use in various countries.

The OECD, which administers the PISA assessments every three years to 15-year-olds from around the globe, periodically publishes reports looking at slices of the data. Its October 2016 report, Ten Questions for Mathematics Teachers and How PISA Can Help Answer Them, based upon the most recent 2012 results, zeroes in on “memorization” and attempts to show that high-performing territories, like Shanghai-China, Korea, and Chinese-Taipei, rely less on memory work than lower-performing places like Ireland, the UK, and Australia.

American Mathematics educator Jo Boaler, renowned for “Creative Math,” jumped upon the PISA Study to buttress her case against “memorization” in elementary classrooms. In a highly contentious November 2016 Scientific American article, Boaler and co-author Pablo Zoido contended that PISA findings confirmed that “memorizers turned out to be the lowest achievers, and countries with high numbers of them—the U.S. was in the top third—also had the highest proportion of teens doing poorly on the PISA math assessment.” Students who relied on memorization, they further argued, were “approximately half a year behind students who used relational and self-monitoring strategies” such as those in Japan and France.

Australian education researcher Greg Ashman took a closer look at the PISA Study and called into question such hasty interpretations of the findings. Figure 1.2: How teachers teach and students learn caught his eye, and he went to work interrogating the survey responses on “memorization” and the axes used to present the data. The PISA analysis, he discovered, also did not include any assessment of how teaching methods might be correlated with PISA scores in Mathematics. Manitoba Mathematics professor Robert Craigen spotted a giant hole in the PISA analysis, noting that the “memorization” data related to the “at-home strategies of students,” not their instructional experiences, and may well indicate that students who are improperly instructed in class resort to memorization on their own.

What would it look like, Ashman wondered, if the PISA report had plotted how students performed in relation to the preferred methods used on the continuum from “more student-oriented instruction” to “more teacher-directed instruction”? Breaking down all the data, he generated a new graph that actually showed how teaching method correlated with math performance, and he found a “positive correlation” between teacher-directed instruction and higher Math scores. “Correlations,” he duly noted, “do not necessarily imply causal relationships,” but the data clearly favoured a higher ratio of teacher-directed activity to student orientation.

Jumping on the latest research to seek justification for her own “meta-beliefs” is normal practice for Boaler and her “Discovery Math” education disciples. After junking, once again, the ‘strawmen’ of traditional Mathematics, “rote memorization” and “drill,” Boaler and Zoido wax philosophical and poetic: “If American classrooms begin to present the subject as one of open, visual, creative inquiry, accompanied by growth-mindset messages, more students will engage with math’s real beauty. PISA scores would rise, and, more important, our society could better tap the unlimited mathematical potential of our children.” That’s definitely stretching the evidence far beyond its breaking point.

The “Math Wars” do generate what University of Virginia psychologist Daniel T. Willingham has aptly described as “a fair amount of caricature.” The recent Boaler-Zoido Scientific American article is a prime example of that tendency. Most serious scholars of cognition support the common ground position that learning mathematics requires three distinct types of knowledge: factual, procedural and conceptual. “Factual knowledge,” Willingham points out, “includes having already in memory the answers to a small set of problems of addition, subtraction, multiplication, and division.” While some students can learn Mathematics through invented strategies, such strategies cannot be relied upon for all children. On the other hand, knowledge of procedures is no guarantee of conceptual understanding, particularly when it comes to complexities such as dividing fractions. It’s clear to most sensible observers that knowing math facts, procedures and concepts is what counts when it comes to mastering mathematics.

Simply ignoring research that contradicts your ‘meta-beliefs’ is common on the Math Education battlefield. Recent academic research on “memorization” that contradicts Boaler and her entourage is simply ignored, even when it emanates from her own university. Two years ago, Shaozheng Qin and Vinod Menon of Stanford University Medical School led a team that provided scientifically-validated evidence that “rote memorization” plays a critical role in building the capacity to solve complex calculations.

In their 2014 Nature Neuroscience study, based upon a clinical study of 68 children, aged 7 to 9, followed over the course of one year, Qin, Menon and their colleagues found that memorizing the answers to simple math problems, such as basic addition or multiplication, forms a key step in a child’s cognitive development, helping bridge the gap between counting on fingers and tackling more complex calculations. Memorizing the basics, they concluded, is the gateway to activating the “hippocampus,” a key brain structure for memory, which gradually expands in “overlapping waves” to accommodate the greater demands of more complex math.

The whole debate over memorization is suspect because of the imprecision in the use of the term. Practice, drilling, and memorization are not the same, even though they get conflated in Jo Boaler’s work and in much of the current Mathematics Education literature. Back in July 2012, D.T. Willingham made this crucial point and provided some valuable points of distinction. “Practice,” as defined by Anders Ericsson, involves performing tasks and receiving feedback on that performance, executed for the purpose of improvement. “Drilling” connotes repetition for the purpose of achieving automaticity, which, at its worst, amounts to mindless repetition or parroting. “Memorization,” on the other hand, relates to the goal of something ending up in long-term memory with ready access, but does not imply using any particular method to achieve that goal.

Memorization has become a dirty word in teaching and learning, laden with so much baggage that it conjures up mental pictures of “drill and kill” in the classroom. The 2016 PISA Study appears to perpetuate such stereotyping and, worst of all, completely misses the “positive correlation” between teacher-directed or explicit instruction and better performance in mathematics.

Why does the PISA Study tend to associate memorization in home-study settings with the drudgery of drill in the classroom?  To what extent does the PISA Study on Mathematics Teaching support the claims made by Jo Boaler and her ‘Discovery Math’ advocates? When it comes to assessing the most effective teaching methods, why did the PISA researchers essentially take a pass? 

 


Educational talk about “grit” – being passionate about long-term goals, and showing the determination to see them through – seems to be everywhere in and around schools. Everywhere, that is, except in the rather insular Canadian educational world. Teaching and measuring social-emotional skills are on the emerging policy agenda, but “grit” is (so far) not among them.

Grit is trendy in American K-12 education, and school systems are scrambling to get on board. A 2007 academic article, researched and written by Angela Duckworth, made a compelling case that grit plays a critical role in success. Author Paul Tough introduced grit to a broad audience in his 2013 book How Children Succeed: Grit, Curiosity, and the Hidden Power of Character, which went on to spend a year on the New York Times bestseller list. And in the same year, Duckworth herself gave a TED talk, which has been viewed more than 8 million times online.

Since then, grit initiatives have flourished in United States school systems. Some schools are seeking to teach grit, and some districts are attempting to measure children’s grit, with the outcome contributing to assessments of school effectiveness. Angela Duckworth’s new book, Grit: The Power of Passion and Perseverance, is one of the hottest North American non-fiction titles this publishing season. In spite of the flurry of public interest, grit has yet to register in the Canadian educational domain.

Over the past three years, the Ontario-based People for Education (P4ED) advocacy organization has been pursuing the goal of broadening the existing measures of student success to embrace “social-emotional skills” or competencies. With a clear commitment to “move beyond the ‘3R’s” and redefine the established testing/accountability framework, P4ED founder Annie Kidder and the well-funded Toronto-centred research team have been creating a “broad set of foundational skills” and developing a method of “measuring schools’ progress toward those goals.”

The Ontario P4ED initiative, billed as “Measuring What Matters” (MWM), proposes a draft set of “Competencies and Skills” identified as Creativity, Citizenship, Social-Emotional Learning, and Health — all to be embedded in what is termed “quality learning environments” both in schools and the community. The proposed Ontario model makes no reference whatsoever to cognitive learning and subject knowledge or to the social-emotional aspects of grit, perseverance or work ethic.

The P4ED project has a life of its own, driven by a team of Canadian education researchers with their own well-known hobby horses. Co-Chair of the MWM initiative, former BC Deputy Minister of Education Charles Ungerleider, has assembled a group of academics with impeccable “progressive education” (anti-testing) credentials, including OISE teacher workload researcher Nina Bascia and York University self-regulation expert Stuart Shanker.

A 2015 MWM project progress report claimed that the initiative was moving from theory to practice with “field trials” in Ontario public schools. It simply reaffirmed the proposed social-emotional domains and made no mention of Duckworth’s research or her “Grit Scale” for assessing student performance on that benchmark. While Duckworth is cited in the report, it is for a point unrelated to her key research findings. The paper also assumes that Ontario is a “medium stakes” testing environment in need of softer, non-cognitive measures of student progress, an implicit criticism of the highly regarded Ontario Education Quality and Accountability Office (EQAO) system of provincial achievement testing.

Whether “grit” or any other social-emotional skills can be taught — or reliably measured — is very much in question. Leading American cognitive learning researcher Daniel T. Willingham’s latest American Educator essay (Summer 2016) addresses the whole matter squarely and punches holes in the argument that “grit” can be easily taught, let alone assessed in schools. Although Willingham is a well-known critic of “pseudoscience” in education, he does favour utilizing “personality characteristics” for the purpose of “cultivating” in students such attributes as conscientiousness, self-control, kindness, honesty, optimism, courage and empathy, among others.

The movement to assess students for social-emotional skills has also raised alarms, even among the biggest proponents of teaching them. American education researchers, including Angela Duckworth, are leery that the terms used are unclear and the first battery of tests faulty as assessment measures.  She recently resigned from the advisory board of a California project, claiming the proposed social-emotional tests were not suitable for measuring school performance.  “I don’t think we should be doing this; it is a bad idea,” she told The New York Times.

Why are leading Canadian educators so committed to developing “social-emotional” measures as alternatives to current student achievement assessment programs? Should social-emotional competencies such as “joy for learning” or “grit” be taught more explicitly in schools? How reliable are measures of such “social-emotional skills” as creativity, citizenship, empathy, and self-regulation?


Alberta’s most unlikely hero, Physics teacher Lynden Dorval, has finally been vindicated. Two years after he was fired in September 2012 by the Edmonton Public Schools for giving his high school students zeros for incomplete work, an Alberta appeal tribunal ruled on August 29, 2014 that he was “unfairly dismissed” and restored his lost salary and pension. There is justice, it seems, in the education world. The bigger question is – how did it happen, and will it encourage more teachers to stand up against eroding educational standards?

The Physics teacher at Ross Sheppard High School was a 33-year veteran with an “unblemished” teaching record. He stood his ground when a new Principal arrived and intervened to end the common practice of teaching students a valuable life lesson: that failing to hand in an assignment or missing a test without a valid reason would result in a mark of zero. In Dorval’s case, he even gave students fair warning and a second chance before taking that step. It worked because Dorval, according to the tribunal, had “the best record in the school and perhaps the province for completion rates.”

The “no zeros” issue came to a head when the school’s computer-generated reports were programmed to substitute “blanks” for zeros, eliminating the practice. Dorval considered banning zeros “a stupid idea” and said he “simply couldn’t follow it.” Two other teachers did the same but escaped any repercussions.

The Alberta tribunal’s decision supported Dorval because he had raised very legitimate questions about whether the policy was good for students.  In the wording of the decision, “the school board did not act reasonably in suspending the teacher. The implementation of the new assessment policy has several demonstrable problems.” Specifically, since there was “no accountability or penalty for missing assignments in the new policy, there was little incentive for a student to actually complete the assignment.”

The written ruling was particularly harsh in its criticism of the principal and former superintendent Edgar Schmidt.  It agreed that Dorval was made an example for challenging the principal’s authority and found that the policy was imposed without proper consultation with teachers, students, or parents. Even more telling, the tribunal was very critical of the Edmonton board for denying Dorval due process during its September 2012 dismissal hearing.

The sheer idiocy of the Edmonton Public Schools student assessment policy was clear to most outside the system. Faced with a groundswell of resistance, the Edmonton board of elected trustees itself backtracked, approving a revised student assessment policy (protecting the Lynden Dorvals) and explicitly allowing zero as a possible mark.

School system Student Evaluation policy remains a total mystery to most parents and even to tuned-in high school students. Over the past two decades, provincial testing programs and school-based student evaluation have been moving in opposite directions.

Provincial tests such as the Ontario EQAO assessments hold students accountable for measuring up to criteria-referenced standards, while school board consultants promote the new “Assessment for Learning” (AfL) theories, pushing up graduation rates through a combination of “no fail” and “do-over” student evaluation practices. Defenders of such ‘soft, pass everyone’ practices, like AfL consultant Damian Cooper, tend to see enforcing higher standards as a dire threat to student self-esteem.

Public school authorities have a way of silencing independently-minded teachers, and many pay a professional price for openly expressing dissenting views. A small number of those educators stumble upon Canadian independent schools, which generally thrive on giving teachers the freedom to challenge students and to actually teach. Thousands of public school teachers just accept the limits on freedom of expression, soldier on and mutter, under their breath, “I’m a teacher, so I’m not allowed to have an opinion.”

Why did Lynden Dorval become an Alberta teacher hero?  It comes down to this: He said “No” to further erosion of teacher autonomy and standards.

 


A Calgary Catholic District school, St. Basil Elementary and Junior High, made headlines in late October when principal Craig Kittelson sent a letter to Grade 7 to 9 parents announcing the elimination of the academic honour roll and end-of-year awards ceremonies.  The controversial Letter to Parents cited the work of American popular writer Alfie Kohn, including the contention that “dangling rewards in front of children are at best ineffective, and at worst counterproductive.”  A Postmedia news story by Trevor Howell in the Calgary Herald and the National Post gave extensive coverage to the eruption of “parent outrage” over both the decision and the way it was summarily announced to the community.

Axing the Academic Honour Roll reignited a public debate over the common practice of giving awards as an incentive to encourage academic achievement. The Calgary Catholic District School Board was caught flat-footed by the outrage. Scrambling for a plausible explanation, the National Post turned to Alfie Kohn’s leading Canadian disciple, Red Deer elementary teacher Joe Bower, who operates the blog for the love of learning. While news reports referenced Joe Bower’s 2007 move to end awards ceremonies at Red Deer’s Westpark Middle School, they made no mention of his related initiatives abandoning homework and refusing to give grades. Nor did the media report that he did so after experiencing an epiphany while reading Kohn’s article “The Costs of Overemphasizing Achievement.”

After “discovering” Kohn, Bower has been on a mission. He’s become a serial @AlfieKohn retweeter, while bouncing from school to school and ending up teaching special needs kids in ungraded classes at the Red Deer Regional Hospital. In September 2013, Bower published a co-edited collection of so-called “progressive education” articles entitled de-testing and de-grading schools, complete with a glowing foreword by none other than his mentor, Alfie Kohn. Almost simultaneously, the Canadian Education Association published a feature article by Bower in Education Canada (Fall 2013), “Telling Time with a Broken Clock,” on the trouble with standardized testing. Kohn’s fingerprints are all over Bower’s articles and posts, hammering away at the evils of academic rewards, homework, and student testing of any kind. It makes you wonder whether this once repudiated, retooled agenda is actually the hidden curriculum of the CEA and its acolytes.

Whatever got into the Calgary Catholic District Board to actually sanction the axing of academic awards? When pressed for a rationale, the CCDSB posted a rather bizarre summary of the “education research” intended to support the decision and come to the rescue of Kittelson, the beleaguered school principal. Surveying that short brief makes for fascinating reading, because it leads off by quoting American radical critic John Taylor Gatto, a leading “unschooler” opposed to compulsory schooling, then cherry-picks evidence from Alfie Kohn’s favourite sources. As a validation for the policy, it’s a classic example of a selective, politically-driven education research “mash-up” — the very kind that has landed education research in such bad odour in academe.

Just when it appeared that America’s leading progressive gadfly was fading in influence, Bower and a new generation of disciples are taking up the cause. Having heard Alfie Kohn speak at a Quebec English Teachers Conference in Montreal in the early 2000s, I have seen, first-hand, his tremendous gifts as an orator and felt the allure of his iconoclastic ideas, until I began to consider the consequences of putting those ideas into actual practice. Born in Miami Beach, Florida, the preppy-looking, reed-thin author and lecturer, now in his late 50s, has authored a dozen books with catchy titles such as No Contest: The Case Against Competition (1986), Punished by Rewards: The Trouble with Gold Stars (1993), The Case Against Standardized Testing (2000), The Homework Myth (2006), and Feel Bad Education (2011). He has staying power, judging from the steady stream of simple Kohn axioms spewing out of Bower and his other camp followers.

Like most educational evangelicals, Kohn has undeniable appeal, especially to North American teachers, tapping into their very real feelings of alienation, powerlessness, and resistance to imposed change. He finds a ready audience because he has identified a vein of dissent and resistance running through the rank-and-file teacher forces, often manifested in opposition to top-down educational decision-making. Academic critics like Daniel Willingham, author of Why Don’t Students Like School, point out that Kohn is effective as an agent provocateur and likely “not bad for you or dangerous to your children.” He raises important questions, but, according to Willingham, “should not be read as a guide to the answers” because his writings “cannot be trusted as an accurate summary of the research literature.” In his reply to Willingham, Kohn held his ground, while conceding that some of his distillations run the risk of oversimplifying complex issues.

One of the most incisive assessments of Alfie Kohn comes from Michael J. Petrilli of The Thomas B. Fordham Institute, an American education gadfly of a different stripe. Writing in the March 2012 issue of Wisconsin Interest, Petrilli hit the mark: “Kohn’s arguments are half-crazy and half-true, which is what makes him so effective — and so maddening.” He also provides a useful corrective to Kohn’s particular educational worldview. “What fuels the modern school reform movement,” he claims, “is not acquiescence to Corporate America but outrage over the nation’s lack of social mobility.” You can be sure this will not appear on Joe Bower’s blog or in one of his next tweets.

What fuels American education gadfly Alfie Kohn’s zealous contrarianism and various progressive education crusades? How much of Kohn’s core progressivist ideology is rooted in the teachings of John Dewey and Jean Piaget — and what proportion is pure creative imagination? What has Kohn actually contributed to the education world in terms of sound policy ideas? What explains his continuing influence and undeniable capacity to attract new adherents?


The arrival of Nova Scotia Student Report Cards in June 2013 provoked quite a reaction from parents, particularly those with students in Atlantic Canada’s largest school board. “Ridiculous.” “Meaningless.” “Mumbo-jumbo.” Those were just a few of the words used by Halifax region parents to describe the computer-generated InSchool reports utilizing PowerSchool, the province’s new $6 million student information system. After an initial attempt by Deputy Education Minister Carole Olsen to deflect the stinging criticism, Education Minister Ramona Jennex, Chair of the CMEC, was compelled to intervene, promising to look into the concerns.

Education officials in Nova Scotia were clearly taken aback by the reaction and acted like it was news to them that the canned reports were incomprehensible to most parents and virtually every student, certainly in the elementary grades. Education Reporter Frances Willick deserves credit for unearthing the latest parental outcry, but the concerns are not new, nor will they be fixed by a few cosmetic changes on student reports.

Some twenty years ago, educators were confronted with a wave of educational reform focused on introducing Outcome Based Education (OBE) and increasing demand for standardized testing to provide more assurance of student achievement levels. Most educational administrators and consultants reacted instinctively against such intrusions and adapted by developing some rather ingenious counter measures.  Unable to stop the advance of standardized assessments, they resorted to retaining control of student reporting and turning it to different purposes.

Student report cards have been filled with gibberish since at least the early 1990s. Much of it can be traced back to a February 1993 Ontario Ministry of Education document known as The Common Curriculum, Grades 1-9. That document introduced teachers to the term “learning outcomes” and attempted to destream Grade 9 and replace a subject-based curriculum with more holistic cross-curricular understandings. In the case of Language Learning Outcomes, reading was downgraded and the word “spelling” dropped from the elementary lexicon. After parents rose up calling for “a set curriculum with specific goals,” the so-called “dumbed-down curriculum” was shelved, but a new student evaluation system based upon “meeting provincial outcomes” survived.

Intended as a means of providing parents with regular communication reflecting measurable standards, OBE student report cards became quite the reverse. School curriculum was re-written around “learning outcomes” and professional development was geared to teaching teachers the new “educratic” language. Communicating with parents gradually morphed from providing personal, often candid comments about students, to tiny snippets reflecting the “expected outcomes.” When Student Reports became standardized and machine-generated, the “student outcomes” jargon became entrenched and accepted, rather sadly, as a demonstration of teaching competence.

Standardizing report cards became a vehicle for implementing the new orthodoxy. In Canada’s provincial systems, OBE was captured by educators who opposed testing and sought to undercut its influence with “assessment for learning.” Grading and ranking students, once the staple of teaching, became dirty words, and the initial standardized reports sought to replace marks with measures of formative assessment. Ontario’s first Standardized Report Card, eventually rejected, attempted to introduce a new set of incredibly vague measures such as “developing,” “developed,” and “fully developed” understandings.

Today’s standardized report cards, as bad as they are, could well have been worse. Many professional teachers, particularly in high schools, resisted the “dumbing down” of curriculum and the invasion of ‘politically-correct’ report writing.  The leading teachers’ unions, the OSSTF, BCTF, and NSTU, opposed student reporting that chipped away at teacher autonomy and consumed more and more of a teacher’s time to complete.  In Nova Scotia, for example, the Grade 9 to 12 reports may be littered with edu-babble comments, but they still provide percentage grades.  In Grades 1 to 8, the reports still retain a watered down letter grade system.

A closer look at the Nova Scotia PowerSchool report card reveals that the grading system has also been affected. In Grades 1 to 8, for example, the range of grades has been narrowed to reflect the goal of equality of outcomes. The highest grade possible is now “A,” which signifies “meeting learning outcomes,” and “F” has been eliminated entirely. No one, it seems, can be outstanding or “exceed expected outcomes.”

The current flap over Nova Scotia report cards is simply the latest manifestation of a much deeper problem. Education authorities favour standardized student information systems, and Nova Scotia’s 2010 full adoption of PowerSchool was mostly driven by the need to track student attendance. The reporting module, adopted from the template with few modifications, was essentially an afterthought. It is quite clear now that PowerSchool was a Trojan Horse for the advance of standardized reports and the latest wave of “canned reporting” and “robo comments.”

When can we finally secure personalized, common sense Student Report Cards? Is it just a matter of linguistic cosmetics or part of a much deeper problem entrenched in the current educational system? What needs to be undone before we restore sanity to student reporting?


A truly weird little YouTube video, Education And Accountability, was posted on February 7, 2013 in an attempt to arouse support for a resurgent Canadian anti-student-testing movement. Inspired by the appeal of the polished RSA animated videos, the amateurish Canadian version is a thinly veiled effort to discredit Ontario’s Education Quality and Accountability Office (EQAO) and its well-established system of standardized student testing. The broadside did not come out of the blue but rather sprang from the team of Action Canada researchers who produced the 2012 policy paper, Real Accountability or Illusion of Success?, a call to review standardized testing in Ontario.

Asking ‘How Much Testing is Too Much Testing?’ is a very legitimate and reasonable question. Since 1995, and the creation of EQAO, student testing has expanded in direct response to growing public demand for accountability in Canadian K-12 public education. What the Action Canada team of Sebastien Despres, Steven Kuhn, Pauline Ngirumpatse, and Marie Josee Parent have done is to call for a review of the entire public accountability structure in the system. The video lays bare their hidden agenda, which is to undermine hard-won public accountability in a system with a chronic aversion to responding to parent, student, or citizen concerns.

Reviewing the Action Canada report is painful for those familiar with the struggle in the early 1990s to bring back a semblance of public accountability to a runaway public education system. The Ontario Royal Commission on Learning, for example, is identified as the point of origin of student testing, completely ignoring the public advocacy of the Coalition for Education Reform (1992-1995). That’s a forgivable sin, but to ignore the rising public demand, even the role of Dr. Dennis Raphael in the Ministry of Education and Paul Cappon at the Council of Ministers of Education (CMEC), is truly amazing. It is clear that the authors have no idea whatsoever about how alien the concept of accountability for student performance was in the Ontario public system.

The Action Canada authors’ research is not only narrowly circumscribed, it’s incredibly selective. Studying student assessment policies while considering only the work of anti-testers is what gives “education research” a bad name. It’s actually entertaining to see the OISE “progressive” faction, most notably David Livingston and Kari Delhi, cited approvingly, while the true assessment experts, Mark Holmes and Stephen Lawton, warrant not a mention. In places, the report shows that the young authors have not done their homework. Even the spin that the report puts on the Commission on Learning’s recommendation sounds like the later repentant thoughts of Co-Chair Gerald Caplan.

The Action Canada team has taken a run at the entire public accountability system in Ontario public education.  Their “Task Force” recommendations call upon the Ontario government to review: A. The Structure of the Tests relative to Objectives; B. The Impact of Testing within the Classroom; C. The Validity of Test Results; and D. Public Reporting and Use of Test Results.  Simply put, the little band of neo-progressives are asking whether, not how much, testing is good for student learning.

The report bears the unmistakable fingerprints of Canada’s leading foe of standardized testing, New York-born University of Ottawa education professor Dr. Joel Westheimer. While the young researchers did hold a panel and hear other viewpoints, it’s obvious that they have swallowed whole Dr. Westheimer’s lively commentaries and inspirational talks. Westheimer’s take on the excesses of No Child Left Behind, entitled “No Child Left Thinking,” should have been referenced because it clearly contributed to their thinking.

The Action Canada report is clearly aimed at bringing Ontario’s public accountability system to the ground. It’s a clumsy, ill-considered attempt to turn back the clock to a time when no one had to be accountable for much of anything in or out of the classroom. The Canadian Education Association and the Ontario Principals’ Council were, of course, among the first to “like” the report on Facebook. Thoughtful critics of standardized testing, like Dr. Diane Ravitch and the American Common Core research group, are rightly concerned about two impacts: the steady erosion of Social Sciences teaching time and the potential for misuse in teacher evaluation. It’s a little too obvious that the young researchers are out to “kill testing” instead of simply stopping student test results from being factored into value-added teacher evaluation programs.

What’s driving the recent move to review and limit standardized student testing in Canadian schools?  Does the Action Canada research report hold any water, let alone suggest a way forward?  Why did the organized voices of Canadian teachers and principals jump so quickly in endorsing this thin little policy paper?  What in the world makes educators so afraid of testing when they spend much of their careers testing and grading kids?


Cheaters do not really prosper in schools, but many are now being given a “second chance.” In a few Canadian and American school districts, giving students a second chance to pass tests, examinations, and other assignments has actually become accepted “student assessment” policy promoting a unique 21st century concept of “fairness.” In Newfoundland’s largest school board, the Eastern School District, the policy was changed in October 2011 so that students caught cheating or plagiarizing would no longer be assigned a mark of zero.  http://www.thestar.com/news/canada/article/1075276–cheating-students-get-second-chance-in-newfoundland

The Newfoundland and Labrador school board’s policy change is not what it seemed – an isolated and rather bizarre deviation from sound education policy. The traditional “automatic zero” is dying a slow death, aided and abetted by student assessment experts, and being supplanted by “do-over” evaluation practice in schools across North America. The Eastern School Board Superintendent Ford Rice was quite accurate when he claimed that the policy was driven by “current literature in education” and was “consistent in philosophy” with policies in other boards across Canada.  http://www.cbc.ca/news/pdf/nl-evaluation-regulations-20111005.pdf

Publicly announcing the Newfoundland school board’s new policy is what really sparked a firestorm of protest. President of the provincial Teachers’ Association Lily Cole spoke out, saying that teachers were not only frustrated but very unhappy with the policy, which took responsibility for teaching “responsibility, respect, honesty, and values” away from regular teachers. “This just takes it out of our hands,” she told both CBC News and The Toronto Star.

“Students will not be given zeros for cheating,” Rice insisted, because the Board’s educational philosophy was to “separate student behaviour from learning to give us a true picture of what the student knows.” Rising to defend the new student cheating policy on the airwaves was perhaps the leading exponent of “do-over” student assessment, Ontario education consultant Damian Cooper. In the old system, he claimed, students who “failed at the test” were “tossed onto the heap” and branded “non-achievers or low-achievers.”  http://www.cbc.ca/news/canada/newfoundland-labrador/story/2011/10/25/nl-cheating-student-reaction-teachers-1025.html

A close examination of newly revised Student Assessment policies in a cross-section of school boards in Ontario, British Columbia, Nova Scotia, New Brunswick, and Ohio is most revealing. Most of the policies are like that of the Halifax Regional School Board (C.007 Program, 1.2.6.4), clearly separating the evaluation of student achievement from that of student behaviour. Indeed, many use the same wording when separating the two, wording virtually identical to that found in Damian Cooper’s book, Talk About Assessment.  http://damiancooperassessment.com/talk.html  In his more recent offering, Redefining Fair, he goes even further in trying to dispel “outdated beliefs regarding fairness” in so-called “mixed-ability classrooms.”

What’s really happening in the strange world of student assessment? A small band of learning assessment experts, led by Damian Cooper and one of his mentors, Scarborough consultant Ken O’Connor, The Grade Doctor, exerts a tremendous influence over school administrators and consultants with little or no background in testing or evaluation. “First and foremost,” O’Connor preaches, “accuracy requires that behaviours and attitudes be separated from achievement, so that grades are pure measures of achievement.” According to this iron dictum, late penalties, absence, academic dishonesty, or even bonus marks have no place in determining student grades. And furthermore, awarding percentage marks is unacceptable because “no one can accurately describe 101 levels” of proficiency. http://www.ascd.org/ascd-express/vol5/503-newvoices.aspx

Student assessment experts like Damian Cooper pop up everywhere because most school boards are desperate to improve their idiosyncratic, autonomous, teacher-driven student evaluation practices. Over the 2009-10 school year, Cooper was hired to give “Tools for Assessment” workshops from one end of the country to the other, including prominent recorded talks in Vancouver, Barrie, ON, and Sackville, NB. From July 5 to 8, 2010, he was the sole presenter at a two-day intensive workshop, entitled “Fostering Assessment Literacy in Our Schools,” sponsored by the CMEC-Atlantic section and funded by NB Education and all four teachers’ unions.

How were Damian Cooper’s assessment theories seeded in the Maritimes? Look no further than the Assessment Summit, held in late August 2009 at Halifax’s World Trade and Convention Centre. Close to 600 school officials and teachers attended the extravaganza headlined by Damian Cooper, Ken O’Connor, and Rick Stiggins, head of Educational Testing Service (ETS) Assessment from Portland, Oregon.

A Media Advisory issued by the NSTU left no doubt about the actual purpose of the education Summit. “These most distinguished assessment experts,” the SSRSB’s Sue Taylor-Foley stated, “will illustrate the fundamental purpose of assessment is not to rate, rank, and sort students, but rather to provide meaningful feedback that leads to improved student learning.” The core theme, she emphasized, was to promote “Common Assessment” across schools in Nova Scotia and beyond.  http://www.nstu.ca/images/pklot/MA_NSELC09.pdf

Since the Newfoundland cheating policy change hit the news, an eerie silence has descended upon Student Assessment Divisions in most Canadian school boards. Superintendent Rice and NLSBA Executive Director Brian Shortall, supported by Cooper, have been fending off a wave of vocal opposition leveled by irate parents, taxpayers, teachers and high school students. Over 75% of all respondents to a CBC News St. John’s poll were adamantly opposed to “pardoning” student cheaters. On the CBC Radio Maritime Magazine show (October 29), “Mind the Gap,” Shortall offered a rather feeble defense of the change and received some tacit support from NB Superintendent Karen Branscombe (NB District 2, Moncton).

Not every Canadian school board has given up on curbing student cheating and plagiarism. The Toronto District School Board policy on “Academic Honesty” stands out as a prime example. “Cheating and plagiarism will not be condoned,” the TDSB policy (PR613) proclaims. What happens if a student violates that policy? “A mark of zero may be awarded for the assignment in question and a repeated pattern of academic dishonesty may result in an escalating severity of consequences.”

Giving student cheaters a second chance is symptomatic of profound changes now underway in student assessment policy. Where is the educational research to support the student evaluation theories being espoused by Damian Cooper and his cohorts? Does completely separating student achievement from student behaviour in the evaluation process make any real sense — and what are the likely consequences? Should student cheaters be pardoned in our schools? Taking the larger view, is all of this threatening to produce what might be called a “do-over” generation?
