
Archive for the ‘Student Assessment’ Category


The COVID-19 pandemic shock knocked out Canada’s provincial school systems and we are now seeing the residual effects. Speaking recently on TVO Ontario’s The Agenda, Western University education professor Prachi Srivastava cut through the usual edu-babble: “I’m shocked at the lack of planning, at the lack of forward planning in the face of what is quite a predictable outcome,” referring to the short- and long-term consequences of mass school closures.

When Srivastava speaks, education authorities should be listening and heeding her advice. She’s one of the few Canadian education researchers attuned to global education development and co-lead author of the June 2021 Ontario Science Table brief on the impact of educational disruption not only in Ontario but from province to province across Canada. Back in July 2021, she and the research team issued a follow-up report confirming the cumulative learning loss and social harms inflicted since March 2020 and recommending that, barring catastrophic circumstances, schools should remain open for in-person learning for the foreseeable future.

A pan-Canadian scan of K-12 COVID-related education plans, conducted by Toronto-based People for Education and released in early February after two years of disrupted schooling, came up virtually empty. While all provinces and territories have public health safety strategies for schools, few have anything approaching a vision or plan to manage, assess or respond to learning loss or the psycho-social impact of mass school closures, and none have allocated sufficient funding to prepare for post-pandemic recovery.

A near total lack of student data is seriously hampering our capacity to assess how the pandemic has affected student learning over the past two years. “One of the problems we have,” Srivastava told the London Free Press, “is that there is no baseline data.” That is confirmed, in spades, in the recent People for Education report. Only four of the 13 provinces and territories (British Columbia, Ontario, Quebec and New Brunswick) are engaged, even in the 2021-22 school year, in any form of data collection, and it’s irregular at best.


As a G7 country, Canada is purportedly one of the seven most highly industrialized and relatively well-resourced liberal democracies on the planet, and it has, relatively speaking, one of the smallest cohorts of children, some 5.1 million, in elementary and secondary school. With all those resources and one of the most extensive educational bureaucracies in the world, it’s fair to ask why our school systems came up short during the pandemic.

Four mass school closings in Ontario have cost K-12 students some 29 weeks of schooling since March 2020, roughly double the average lost time, 14 to 16 weeks, across advanced industrial societies. While Ontario leads in weeks claimed by school closures, most other provinces are close behind, with Nova Scotia and New Brunswick, for example, checking in at 20 to 22 weeks of disrupted instructional time. In Nova Scotia, the loss is compounded by the fact that four to six additional days have been lost to storm-day closures where teachers are not required to provide alternative instruction.

Suspending or curtailing system-wide student assessments has compounded the problem. With Ontario’s Education Quality and Accountability Office (EQAO) testing cancelled during the pandemic, there was no way to assess how that province’s two million students were performing or whether they were recovering. “My assessment,” Srivastava claims, “is that we could have used the EQAO in a different way. We could have used it to monitor what the baseline was…then we could have rerun the EQAO.”

The Ontario pattern was repeated elsewhere as provinces, one after another, abandoned large-scale student assessments and suspended high school examinations. Maintaining consistent and credible benchmark assessments would certainly have made logical sense and left us better prepared to plan for the recovery. While some provinces, including Ontario and Nova Scotia, have restored testing in 2021-22, the results are going to be difficult to analyze without consistent baseline data.

School authorities have failed us during the COVID-19 pandemic and it will prove costly for the pandemic generation of children. A child who was in Kindergarten in March 2020 is now in Grade 2 and will be in Grade 3 in September 2022, so pandemic closures will have cost them between 10 and 27 weeks of their schooling. Students in Grade 9 when COVID-19 hit will have had their entire high school years disrupted by closures and mostly ineffective online learning experiments.

Repeated pivots to emergency home learning were detrimental to school-age children and families, and education was used as a “pandemic control” instrument without sufficient recognition of the academic and social impacts on children and teens. Public policy devolved into complying with public health dictates and responding, in ad hoc fashion and on the fly, to educator and parent concerns, applying band-aid upon band-aid, from social distancing to cohort bubbles to HEPA filter units, to secure a modicum of consent, several times, to restart in-person school.

Serious research into COVID-19’s impact on student learning is gradually emerging and, given the preoccupations of our education schools, it originates mostly elsewhere. Studies in the United Kingdom during COVID-19 point to a learning loss of between two months and two years, depending upon the educational jurisdiction. One of the few Canadian studies, conducted by University of Alberta researcher George Georgiou, found that students in Grades 1 and 2 in Edmonton and Vermilion performed, on average, 8 months to a full year below grade level on reading tasks at the end of the last school year. More recently, a U.S. study, conducted from 2019 to 2022 by Amplify utilizing DIBELS assessments, found that more than 1 in 3 children from Kindergarten to Grade 3 fell significantly short of their expected reading level, a gap unlikely to close without major and systematic interventions.

A more coherent, integrated and responsive pandemic education recovery plan is now a matter of immediate necessity. At the risk of sounding like a broken record, the key components of such a plan, repeatedly articulated by Srivastava, me and others, are hiding in plain sight. Such a comprehensive plan would consist of three main education recovery initiatives:

  • Revamp the entire K-12 curriculum – recognizing that it’s a massive “catch-up operation” in which parts of the curriculum in each year need to be lengthened, some curriculum moved into the next grade, and other parts missed earlier integrated into the current grade.
  • Boost core competencies and skills in reading and numeracy – close the basic skills gap while introducing pro-social skills throughout the curriculum for all children, focusing on the elementary grades.
  • Implement targeted interventions – focusing on schools with the highest number of disruptions and infection rates, or large numbers of students from marginalized communities or special needs students.

Three years ago, Canadian K-12 education occupied a bubble and the architects of the current school system were fond of routinely referring to Ontario as a “world class system.”  When the pandemic hit, prominent Canadian school promoters saw it as a golden opportunity to “build back better” with a focus on enhancing social and emotional learning.  What a difference a pandemic makes. It’s now a recovery mission and there’s no room for complacency.

Why did Canadian school systems shut down their student assessment programs during the two-year long pandemic?  What explains the lack of preparedness and the inability to respond effectively in overseeing, monitoring, and reporting on student academic progress and well-being? When can Canadian parents and educators expect to see some strategy and plans for learning recovery in the wake of the pandemic? 

Read Full Post »


Rising children’s reading scores in Ontario may well be an illusion. Early literacy rates as measured on Ontario standardized tests have, we now know, been inflated by the use of Assistive Technology (AT). That was the biggest revelation contained in a ground-breaking September 2021 report, Lifting the Curtain on EQAO Scores, produced by the Ontario branch of the International Dyslexia Association (IDA/Ontario).

“There are so many students struggling to read whose experiences are being hidden right now,” says Alicia Smith, president of IDA Ontario. “Our goal in producing this report is to bring attention to the depth of the real issues. These are being swept under the carpet.”

Ontario’s provincial student assessment agency, the Education Quality and Accountability Office (EQAO), has produced some problematic data. Between 2005 and 2019, the EQAO reported a steady increase in reading scores for students in grades 3 and 6.  On the Grade 3 test, the proportion of students meeting the provincial standard reportedly jumped from 59 to 74 per cent, a 15-point gain in the prime indicator of literacy.

What the EQAO did not publicly disclose was that increasing numbers of students were being provided with ‘accommodations’ such as AT when writing the test, which most likely inflated the numbers. Nearly one in five students (18 per cent) utilized AT to complete the EQAO assessment in 2019, up from 3 per cent back in 2005.

Assistive technology is now commonplace in Canadian schools, widely used to diagnose reading difficulties and to provide computer-assisted help with reading. During provincial tests, students with diagnosed reading difficulties are now routinely allowed to listen to an audio version of the text and the comprehension questions. In many cases, they are also accommodated by having an adult, either a teacher or a volunteer, write down the student’s verbal responses.

Gains in Ontario early reading scores shriveled up almost entirely when the use of assistive technology was factored into the actual results. Whereas 56 per cent of students met the standard without the use of assistive technology in 2005, the figure was only marginally higher at 62 per cent in 2019.
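
To see how much of the reported improvement rests on those accommodations, here is a rough back-of-the-envelope sketch in Python. The percentages are the ones quoted above; reading the gap between the reported rate and the unassisted rate as the share of students who met the standard only with AT in place is my own simplification for illustration, not a breakdown published in the IDA Ontario report.

```python
# Back-of-the-envelope check on the EQAO Grade 3 reading figures quoted above.
# Treating (reported - unassisted) as the share of students who met the
# standard only with an AT accommodation is an illustrative inference.

figures = {
    # year: (reported % meeting the standard, % meeting it without AT)
    2005: (59, 56),
    2019: (74, 62),
}

for year, (reported, unassisted) in figures.items():
    at_assisted = reported - unassisted
    print(f"{year}: {at_assisted} of the {reported} points came via AT accommodations")

reported_gain = figures[2019][0] - figures[2005][0]    # 74 - 59 = 15 points
unassisted_gain = figures[2019][1] - figures[2005][1]  # 62 - 56 = 6 points
print(f"Reported gain, 2005-2019: {reported_gain} points")
print(f"Gain excluding AT-assisted passes: {unassisted_gain} points")
```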

Reported pass rates for the Grade 10 Ontario Secondary School Literacy Test (OSSLT) have also been flagged as a cause for concern. While the EQAO reports that the percent of successful ‘first time eligible’ students has hovered between 80 and 82 per cent, the non-participation rate has more than doubled, rising from 8.4 per cent in 2005 to 19 per cent in 2019. Little is known about students who do not write the OSSLT, but Toronto District School Board data reveals that two-thirds (65 per cent) of students who do not participate in the OSSLT do not end up applying for post-secondary education.


When provided with appropriate early instruction, an estimated 95 per cent of all students are cognitively capable of learning to read. In Ontario and every other Canadian province, the IDA and many reading experts see a large gap between children’s potential and current reading outcomes.

Experienced literacy experts and tutors have seen it all over the years.  “It’s a complete joke,” says Jo-Anne Gross, founder of Toronto-based Remediation Plus. “Most of the kids diagnosed and coded don’t have learning disabilities. They just don’t know how to read.”  Gross applauds IDA Ontario for exposing the hidden problem. “The authenticity of the reading scores is sadly lacking,” she claims, “and the public has a right to full disclosure.”

Ontario parent David Logan, a Kingston father of a Grade 5 son struggling with reading, told CBC News in October 2021 that assistive technology was little help to his son in mastering reading skills and his local public school had no plan to help him progress beyond needing the device. He’s fairly typical of many concerned parents who have come forward to testify at hearings of the ongoing Ontario Right to Read inquiry into human rights issues affecting students with reading disabilities.

While assistive technology can be very useful in helping educators to diagnose particular reading skill deficits, it is problematic when utilized to ‘read’ to students and produce scripts on standardized literacy tests. There are some unintended consequences. It’s not just the technology, notes University of Toronto clinical psychologist Todd Cunningham; it’s more about the “accommodations” made in completing the test. He explains what actually happens: “When there are teachers in the room, it’s natural for them to help out struggling kids.”

The recent Ontario revelations about inflated EQAO literacy scores do give us some indication of what to expect when the much-anticipated Right to Read public inquiry report finally lands in the spring of 2022.

Why are so many younger students still struggling with reading?  Is there any substitute for effective instruction in early reading?  Should school systems implement end of grade 1 phonics checks as a matter of policy? What is an appropriate role for the use of Assistive Technology? Should AT be used by students completing provincial assessments? If so, does the public have a right to know the extent of its use and literacy rates unassisted by such technology?

Read Full Post »

The most recent April 2021 Fraser Institute report on Mathematics performance of students across Canada contained very few surprises. Students from Quebec continue to be at the head of the class. On the benchmark Programme for International Student Assessment (PISA) from 2003 to 2018, they scored the highest (532 in 2018), 20 points above the Canadian average, and continued to outpace students from every other province. Steep declines have been registered by students from Alberta (-38 points), British Columbia (-34 points), and Saskatchewan (-31 points). Scores for students from two Maritime provinces, Nova Scotia and New Brunswick, have steadily declined and now hover around the OECD mean score of 494.


Most interesting to analyze is New Brunswick because it exemplifies why Canadian students produce such mediocre results. With PISA scores dropping from 511 (2003) to 491 (2018), New Brunswick 15-year-old students perform well below the national mean scores on a “steadily negative” trajectory over the past fifteen years. On the past three PISA tests, 2012 to 2018, their scores have declined by 2.2 per cent, third worst among the provinces. The national Grade 8 PCAP results for 2010 to 2016, while below the national mean, do show slight improvement, albeit on assessments keyed to provincial curriculum standards. What jumps out at you in the report, however, is the row of blanks for provincial math assessments in New Brunswick and the statement “insufficient data to estimate trends.”

Assessing student capabilities in mathematics should, one would think, be a provincial priority when there’s plenty of evidence that students are still struggling in math. The clearest example of this, confirmed in interviews with math tutors over the past two weeks, is that most N.B. students today are so lacking in basic computational skills that they cannot complete secondary school math placement tests without a calculator.

Calculator dependence is now widespread in New Brunswick schools and its most telling impact is in the lagging Mathematics achievement of students.  The use of calculators in North American math classrooms has been common since the 1980s, but top performing nations, such as Singapore, China and Korea, put far more emphasis on integrating mental computation with conceptual understanding before progressing to higher-level math and problem solving. That approach is also reflected in the most successful after-school math tutoring programs such as Kumon Math and the Toronto-based alternative, the Spirit of Math, widely used in Ontario independent schools.

Provincial school officials do not generally react to periodic reports that students are struggling in mathematics, pointing to rising teacher-assigned student grades and healthy graduation rates. Those in the ‘shadow school system’ of private tutoring and the math assessment offices of universities and colleges have no such inhibitions. Most are alarmed at what they see and learn while conducting intake assessments of prospective students. Most incoming students perform one or two grades below expected levels and, moving upwards through the grades, wide variations appear in students’ skill levels and competencies.

‘Discovery Math’ is the prevailing teaching approach in the vast majority of N.B. elementary schools and the tutors insist it’s not working for far too many students. “Most students have gaps in their skills,” says Rhonda Connell, manager of Fredericton’s Kumon Math and Reading operation, with 28 years of tutoring experience. “The N.B. curriculum is not skills-based, but rather more exploratory of different methods.”

What’s wrong with that approach?  “Students in public schools without basic skills get taught long and complicated operations and the kids get lost,” Connell tells me. “They don’t know their mental math and that’s why high school students simply cannot do the Kumon placement test without a calculator.”

The mathematics deficits grow as students progress from elementary grades into high school. “There’s a widening gap,” says Connell. She finds that students do not know their fractions, cannot do long division or basic subtraction and borrowing operations. The bottom line: “Students don’t have the skills at hand to engage in problem-solving and higher-level math.”

The founder of Mathnasium in Moncton, Jocelyn Chan, saw through the eyes of her son, now seven years of age, that mathematics education was sadly lacking. As a CPA with plenty of corporate finance experience, she decided to do something about it by opening the first Mathnasium franchise operation in Atlantic Canada. Since opening in October 2020, it’s grown from 4 or 5 students to 70 enrolments today, with a majority of students in Grades 5 and 6, where the math deficits become more pronounced and visible to parents.

The pandemic shutdowns and default to hybrid learning have set students back, particularly in a more teacher-dependent subject like mathematics. “A lot of Moncton area students were already behind to begin with,” Chan says, “so the learning loss is more acute.” “Lots of Grade 9s this year are struggling,” she notes, “because COVID-19 caused them to lose half of their Grade 8 year, leaving them unprepared for the next grade.”

Private after-school tutoring programs such as Kumon and Mathnasium cater to upwardly mobile, affluent families with the financial resources to afford such programs. Out of 331 Kumon operations in Canada, there’s only one in New Brunswick. While the Fredericton Kumon centre run by Connell has grown steadily from some 30 to 40 students in 1993 to 141 students today, that’s still a small fraction of the total student population.

Many of the new clients also turn out to be newcomers, recently arrived in the province. Most local parents, according to Connell and Chan, only become concerned when they see their children falling behind or getting lower grades. “People moving here from elsewhere,” Connell notes, “expect more” and “come to Kumon saying that there’s nothing going on in the schools.”

Unaddressed math problems surface again when students proceed on to university and find themselves in popular programs like management, marketing, or economics where some math skills are required to master the core content.  Many turn to mathematics and language remediation programs.

Senior Math instructor C. Hope Alderson is on the front lines as coordinator of the UNB Saint John Flora Beckett Mathematics and Science Help Centre. As a mathematics tutor, she spends most of her time building the skills and confidence of students struggling in their university courses. Choosing her words carefully, Dr. Alderson confirms what private after-school tutors say about today’s students. “Students have quite an attachment to the calculator,” is how she puts it. “There’s certainly less emphasis on mental computations in today’s schools. They grab the calculator to do simple calculations.”

The pandemic is not helping the situation. Faced with stay-at-home orders, students and families were left with online remedial programs or strictly-limited in-person, socially-distanced tutoring. Enrollment in Kumon Fredericton peaked in 2019, just before the school shutdown.  Since then, home learning and family stresses have kept families away from Kumon.  “Family stresses ran high,” says Connell, “and it had an effect on students’ abilities to focus on their math.” Separation from their social group was especially hard on teenage students.

Mastery of basic math skills is being sadly neglected in our K-12 schools. Conceptual understanding should not be emphasized to the virtual exclusion of mental computation skills. Getting a calculator to do the mathematics for you has contributed to the entrenched problem.

*An earlier version of this commentary appeared in The Telegraph-Journal, provincial edition, in New Brunswick.

Why are Canadian students losing ground in Mathematics on the benchmark PISA tests administered every three years?  What can we learn from a case study looking at the state of math competencies in New Brunswick? Is it a combination of factors?  If so, what needs to be done to address the underperformance of our students on international assessments?  

Read Full Post »

Mr. Zero to Hero: Alberta Physics teacher Lynden Dorval, May 2012

Suspending Alberta diploma exams in October and November 2020 is understandable in the midst of a global pandemic, but it will have unintended consequences. Replacing exams with sound, reliable, standards-based and replicable alternative forms of summative assessment is a formidable challenge. Taking a longer-term view, it will most likely only exacerbate the gradual and well-documented slide in the province of Alberta’s graduation standards.

While some students and their parents retained the right to write exams, the die is cast and it may also signal the death knell for final exams in a province once hailed for having Canada’s best education system. Eliminating final exams, as demonstrated in my new book The State of the System, has hidden, longer-term consequences, significantly contributing to the ‘big disconnect’ between rising student attainment (i.e., graduation rates and averages) and stagnating or declining achievement.

Critics of exams contend that formal, time-limited assessments cause stress and can affect student well-being. Such claims are disputed by Canadian teen mental health experts, including Stan Kutcher and Yifeng Wei, as well as cognitive scientists like Erin Maloney who cite evidence-based research demonstrating that tests and exams are examples of the “normal stress” deemed essential to healthy human development.

Sound student evaluation is based upon a mix of assessment strategies, including standardized tests and examinations. Testing remains a critical piece, countering more subjective forms of assessment. UK student assessment expert, Daisy Christodoulou, puts it this way: “Tests are inhuman – and that is what is good about them.”

While teacher-made and evaluated assessments appear, on the surface, to be more gentle and fairer than exams, such assessments tend to be more impressionistic, not always reliable, and can produce outcomes less fair to students. They are also laden with potential biases.

A rather extensive 2015 student assessment literature review, conducted by Professor Rob Coe at Durham University, identifies the typical biases. Compared to standardized tests, teacher assessment tends to exhibit biases against exceptional students, specifically those with special needs, challenging behaviour, language difficulties, or personality types different from their teacher’s. Teacher-marked evaluations also tend to reinforce stereotypes, such as the notions that boys are better at math or that racialized students underperform in school.

Grade inflation has been an identified and documented concern in high schools since the 1980s, long before the current pandemic education crisis. Two Canadian sociologists, James Cote and Anton Allahar, authors of Ivory Tower Blues (2007), pinpointed the problem of high school students being “given higher grades for less effort” and expecting the same in Ontario universities. One authoritative study, produced at Durham University in the UK, demonstrated that an ‘A’ grade in 2009 was roughly equivalent to a ‘C’ grade in 1980.

What has happened to Alberta high school graduation standards? Back in 2011, Maclean’s magazine ranked Alberta as Canada’s best system of education based upon the performance of its graduating students. With compulsory provincial exams in place in the core subjects, some 20 per cent of Alberta’s Grade 12 students achieved an ‘A’ average, compared to roughly 40 per cent of students across Ontario high schools.

Grading standards in Alberta were demonstrably more rigorous than those in Ontario and other provinces. The University of Calgary’s Dean of Arts described Ontario high schools as being engaged in “an arms race of ‘A’s.” A 2011 University of Saskatchewan admissions study of 12,000 first-year university students’ grades reported that Alberta high school graduates dropped 6.4 percentage points, compared to as much as 19.6 points for those from other provinces. In 2017-18, a leaked University of Waterloo admissions study revealed that the average Ontario student’s grades dropped 16 per cent.

‘No fail’ and ‘no zero’ student assessment policies proliferated in the early 2000s and most of the resistance stemmed from secondary school teachers, particularly in Alberta. Senior-grade subject teachers in Mathematics and Science were in the forefront of the underground battles over teachers’ autonomy in the classroom. Constraining teachers from assigning ‘zeros’ for incomplete or missing work proved to be the biggest bone of contention.

It flared up in Alberta in May 2012 when Edmonton physics teacher Lynden Dorval, a thirty-three-year veteran with an unblemished teaching record, was suspended, then fired, for continuing to award zeroes, refusing to comply with a change in school assessment policy. It all came to a head when the school board’s computer-generated reports substituted blanks for zeroes. An Alberta tribunal found that Dorval gave students fair warning, and that his methods worked because he had “the best record in the school and perhaps the province for completion rates.” The previously obscure Alberta Physics teacher went from “zero to hero” when he was exonerated, but it proved to be a small victory on the slippery slope to dumbed-down standards.

Grade inflation seeped into Alberta high schools when that province moved away from weighting diploma exams at 50 per cent of the final subject grade, reducing them to 30 per cent. In June 2016, under the new policy, 96 per cent of Math 30-1 students were awarded a passing school grade, compared to 71 per cent of those who took the diploma exam, a gap of 25 percentage points. The same pattern was evident in Nova Scotia up until June 2012, when the province eliminated all Grade 12 provincial exams. Since Nova Scotia moved its provincial exams from Grade 12 to Grade 10, that province’s graduation rates have skyrocketed from 88.6 per cent to 92.5 per cent in 2014-15.
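
For readers wondering how a change in exam weighting alone can lift final grades, here is a minimal illustrative sketch. The 50 and 30 per cent weightings come from the paragraph above; the sample marks are invented for the purpose of the example.

```python
def final_grade(exam_mark: float, school_mark: float, exam_weight: float) -> float:
    """Blend a diploma exam mark with the school-awarded course mark."""
    return exam_weight * exam_mark + (1 - exam_weight) * school_mark

# Hypothetical student: a generous school-awarded mark and a weaker exam result.
exam_mark, school_mark = 65, 85

under_old_weighting = final_grade(exam_mark, school_mark, 0.50)  # 75.0
under_new_weighting = final_grade(exam_mark, school_mark, 0.30)  # 79.0

print(f"50% exam weighting: {under_old_weighting:.1f}")
print(f"30% exam weighting: {under_new_weighting:.1f}")
# Same performance, four extra points: shrinking the exam's weight lets
# higher school-awarded marks pull final grades upward.
```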

While far from perfect, exams do provide not only a more rigorous form of summative assessment, but a fairly reliable benchmark of how students perform across a provincial system. It is, after all, next-to-impossible to establish comparability or assessment benchmarks to assess the alternatives such as uneven and highly idiosyncratic ‘demonstrations of learning.’

The Alberta system, once rated Canada’s best on the basis of its graduation standards, is gradually losing its edge. Suspending the diploma exams in 2020-21 may turn out to be a temporary blip or stand as further evidence of an abandonment of more rigorous graduation standards.

Why did Alberta lose its undisputed status as Canada’s best education system? How important were final exams in solidifying that province’s graduation standards? What is the connection between final diploma exams and two key performance indicators — grade inflation and graduation rates? Why have the universities remained relatively silent while evidence accumulates testifying to the softening of graduation standards?

Read Full Post »

The Homework Debate never seems to go away. Popular books and articles inspired initially by American education writer Alfie Kohn and his Canadian disciples continue to beat the drum for easing the homework burden on students or eliminating homework altogether before the secondary school level. That “No Homework” movement made significant inroads in the United States and Canada during the 2000s. The Organisation for Economic Co-operation and Development (OECD), responsible for the Programme for International Student Assessment (PISA), confirmed that the amount of time students in North America spend doing homework had declined, as of the 2014 assessment year.


A critical question needs to be asked: Has the “No Homework” movement and the apparent push-back against homework had an adverse effect on student achievement? That’s difficult to answer because, despite the critical importance of the issue and the long history of homework research, few North American researchers have been inclined to study the role that homework plays in enhancing student achievement, even in mathematics.

One little-known researcher, Lake B. Yeworiew, an Ethiopian scholar based at the University of Calgary and recently arrived in Canada, saw the hole in the research and tackled the whole question. His focus was on assessing the relationship between homework and Grade 8 Mathematics achievement, comparing Canadian students with the top-performing students in the world. While attending the AERA 2019 Congress (April 5-9) in Toronto, I got a sneak peek at his findings. While his research study attracted little attention, it will be of considerable interest to all of those committed to maintaining and improving student performance standards.


His University of Calgary study, co-authored with Man-Wai Chu and Yue Xu, laid out the essential facts: the average performance of Canadian students in Mathematics (PISA) has declined since 2006 (OECD, 2007, 2010, 2014, 2016). Students from three top-performing Asian jurisdictions, Singapore, Macao-China and Japan, continue to outperform our 15-year-old students by a significant margin. Furthermore, the OECD reports that students in Asian jurisdictions (Singapore, Japan, Macao-China and Hong Kong-China) spend more time doing homework and score much higher. It is estimated that they score 17 points or more per extra hour of homework.
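
To put that OECD estimate in rough perspective, here is a minimal sketch of what a 17-points-per-extra-hour relationship implies. The baseline score and the hour increments are hypothetical, and the underlying relationship is, of course, not strictly linear.

```python
# Illustration of the OECD estimate cited above: roughly 17 PISA points per
# additional weekly hour of homework. Baseline and increments are hypothetical.
POINTS_PER_EXTRA_HOUR = 17
baseline_score = 500  # a hypothetical student near the OECD average

for extra_hours in (1, 2, 3):
    predicted = baseline_score + POINTS_PER_EXTRA_HOUR * extra_hours
    print(f"+{extra_hours} hour(s) of homework per week -> about {predicted} points")
```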

Recent North American research seems more alert to the need to study the relationship between homework and academic achievement, particularly in mathematics. A literature review, conducted by Yeworiew, Chu and Xu, demonstrates that, while the findings cut in both directions, the weight of research favours homework. In fact, the Council of Ministers of Education, Canada (CMEC, 2014) has come down in favour of homework. Based upon Canadian national test surveys (PCAP), CMEC confirms that the math achievement of students who do not do homework is significantly lower than that of those doing regular homework.

Yeworiew and his research team provide further confirmation of this 2014 CMEC assessment. Utilizing the 2015 TIMSS study in Canada, involving 8,757 students and 276 schools in four provinces (Ontario, Quebec, Manitoba and Newfoundland/Labrador), the authors demonstrate the clear value of regular homework in modest amounts.

The research findings were effectively presented in a series of graphs mapping the study results in their AERA 2019 Toronto presentation.

The relationship between homework and achievement is becoming less of a mystery. Based upon the performance of Grade 8 students in the 2015 TIMSS study, short but frequent homework assignments contribute to improved student learning and achievement in mathematics. Frequent homework assignments, up to four times a week, have a positive effect on math achievement, but less so when the homework is of longer duration. No discernible differences were detected between girls and boys at the Grade 8 level in Canada.

Why do Canadian researchers produce so few studies like the University of Calgary project attempting to assess the impact of homework on achievement?  To what extent is it because Canadian homework studies tend to focus on psycho-social aspects such as the impact of homework on student attitudes and the opinions of parents?

Are we asking the right questions? “How much is enough?” is surely a sounder line of inquiry than “How do you feel when overburdened with homework?” What is really accomplished by asking “Does homework add to your anxieties?” Should we be more conscious of the inherent biases in such research questions?

Read Full Post »

Student report cards are a critical point of contact with parents and that’s why they attract more critical scrutiny than other aspects of K-12 education. Most parents seek clear, intelligible, individualized, regular student progress reports with understandable grades, while student assessment consultants come up with wave after wave of changes modeling the latest proposed innovation in assessment practice. That explains, in many ways, why this subterranean issue never seems to disappear.

Every five years or so, school authorities in one Canadian province after another attempt to revamp their student report cards, usually aiming to challenge the prevailing orthodoxy. Introducing outcomes-based student assessment in the 1990s produced a new impenetrable language accompanied by “competencies” and hundreds of “micro-outcomes.” Repeated attempts were made to replace letter grades in elementary schools and percentage marks in high schools with outcomes-based reporting and newly-constructed scales of development in learning. That initial wave produced what have become standardized, digitally-generated provincial or school district report templates.

Most top-down report card modernization plans end up imposing heavier reporting loads on teachers and leaving most parents baffled. Six years ago, Nova Scotia parent Marshall Hamilton spoke for perhaps hundreds of thousands of parents: “I don’t see my child in the comments.” “The language doesn’t really give the parent or the child any idea of critical feedback,” he explained to CBC News. “I can probably figure out more about what the curriculum is meant to do than to understand my daughter’s performance in that current curriculum.”

Student report cards in Canadian school systems are, in theory, intended to provide “meaningful information” to parents and guardians on “how their child is progressing in school.” Since that wave of parent criticism six years ago, Nova Scotia’s student reports have become far clearer and more intelligible, with actual marks from Grades 7 to 12, but there are still a few missing pieces.

Legitimate concerns about teachers’ classroom conditions and workloads sometimes prompt initiatives to “streamline” reporting that have unintended consequences.  Surveying his daughter’s November 2018 Grade 6 Nova Scotia report, former teacher Kristopher Snarby was surprised to see that it provided no feedback on subject courses representing over half her weekly schedule. Report cards from Grades P to 6, Snarby discovered, only contained marks and comments on Language Arts and Mathematics, providing no marks, comments or attendance for any of her other subjects. The standard provincial report template simply did not fit his daughter’s school, where multiple teachers taught a variety of subjects.

Those report card changes originated back in March 2018 as one of the recommendations to “streamline” November reports from a provincial teacher advisory body, the Council to Improve Classroom Conditions. The problem, as framed by the Council, was “time-consuming” reporting processes and reports that were “confusing to parents.” The solution: reduce “data entry for teachers” and provide “integrated comments” for only two subjects, Language Arts and Math.

Making reports less comprehensive with fewer subject-specific comments would never fly with parents who, after all, are the main consumers of those reports. If they were ever asked, they would also likely favour reports with more definitive feedback, including individual student assessment test results in Grades 3, 6 and 8. Elementary student progress reports that provide feedback on integrated subjects also tend to obscure how students are actually performing in two critical areas, reading and numeracy.

Providing parents with reports including their own child’s provincial assessment scores would remedy that omission. That is not such an outlandish idea when one considers the latest teacher-friendly innovations (Christodoulou, 2017) in student assessment reporting. Most North American school authorities are actually providing more information, not less, on both school standards and individual student performance.

Take Ontario, for example. Students in Ontario are all tested in grades 3 and 6 and, while they do not appear on school progress reports, the independent Education Quality and Accountability Office (EQAO) provides parents with a detailed individual report on their child’s progress, benchmarked against provincial student performance standards.

The EQAO individual student report card for Primary Division (Grades 1-3) provides incredibly detailed feedback on reading, writing, and mathematics, reflecting four distinct levels of achievement. It’s also relatively easy to identify how students actually measure up in their performance.

The Grade 9 EQAO math test is a component of the regular school report, accounting for up to 10 per cent of a student’s math mark. Ontario students are also required to pass a Grade 10 Literacy Test or remedial Literacy course to secure a secondary school diploma.

Parents in Ontario are encouraged to work together in partnership with their teachers to improve student learning. “Talk to your child’s teacher,” the EQAO report advises, “about how these results compare to your child’s daily classroom work and assessment information.”

Providing parents with individual student reports on provincial assessment results would be a step forward, but integrating them into Grade 3, 6, and 8 school district reports would be even better. Then parents would be able to see, on one report, how students were performing not only in local schools, but in relation to provincial standards.

What Canadian education needs is more parents like Kristopher Snarby keeping an eye on changes in the system. As a former teacher, he is particularly alert to “teacher-speak” on reports that are “not really intelligible for parents.” “Feedback is critical for parents,” Snarby says, and “that’s why what’s on student reports  really matters.”

Do Student Report Card reforms make matters better – or worse — for parents and students?  Can we find the right balance between providing meaningful, individualized reports while easing teachers’ workloads? What can possibly be wrong with giving teachers more autonomy to make more personal, pointed comments about actual student performance?  Would it be helpful to see both teacher assessments and provincial test results on those reports?  

Read Full Post »

University of Kentucky student assessment guru Thomas R. Guskey is back on the Canadian Professional Development circuit with a new version of what looks very much like Outcomes-Based Education. It is clear that he has the ear of the current leadership in the Education Department of Prince Edward Island. For two days in late November 2018, he dazzled a captive audience of over 200 senior Island school administrators with his stock presentations extolling the virtues of mastery learning and competency-based student assessment.

P.E.I.’s Coordinator of Leadership and Learning Jane Hastelow was effusive in her praise for Guskey and his assessment theories. Tweets by educators emanating from the Guskey sessions parroted the gist of his message. “Students don’t always learn at the same rate or in the same order,” Guskey told the audience. So, why do we teach them in grades, award marks, and promote them in batches?

Grading students and assigning marks, according to Guskey, can have detrimental effects on children. “No research,” he claims, “supports the idea that low grades prompt students to try harder. More often, low grades lead students to withdraw from learning.”

Professional learning, in Guskey’s world, should be focused not on cognitive or knowledge-based learning, but on introducing “mastery learning” as a way of advancing “differentiated instruction” classrooms. “High-quality corrective instruction,” he told P.E.I. educators, “is not the same as ‘re-teaching.’” It is actually a means of training teachers to adopt new approaches that “accommodate differences in students’ learning styles, learning modalities, or types of intelligence.”

Guskey is well-known in North American education as the chief proponent for the elimination of percentage grades. For more than two decades, in countless PD presentations, he has promoted his own preferred brand of student assessment reform. “It’s time,” he insists, “to abandon grading scales that distort the accuracy, objectivity and reliability of students’ grades.”

Up-and-coming principals and curriculum leads, most without much knowledge of assessment, have proven to be putty in his hands. So what’s the problem? Simply put, Dr. Guskey’s theories, when translated into student evaluation policy and reporting, generate resistance among engaged parents looking for something completely different – clearer, understandable, jargon-free student reports with real marks. Classroom teachers soon come to realize that the new strategies and rubrics are far more complicated and time-consuming, often leaving them buried in additional workload.

Guskey’s student assessment theories do appeal to school administrators who espouse progressive educational principles. He specializes in promoting competency-based education grafted onto student-centred pedagogy or teaching methods.

Most regular teachers today are only too familiar with top-down reform designed to promote “assessment for learning” (AfL) and see, first hand, how it has led to the steady erosion of teacher autonomy in the classroom.

While AfL is a sound assessment philosophy, pioneered by the leading U.K. researcher Dylan Wiliam since the mid-1990s, it has proven difficult to implement. Good ideas can become discredited by poor implementation, especially when formative assessment becomes just another vehicle for a new generation of summative assessment used to validate standards.

Education leaders entranced by Guskey’s theories rarely delve into where it all leads for classroom teachers.  In Canada, it took the “no zeros” controversy sparked in May 2012 by Alberta teacher Lynden Dorval to bring the whole dispute into sharper relief. As a veteran high school Physics teacher, Dorval resisted his Edmonton high school’s policy which prevented him from assigning zeros when students, after repeated reminders, failed to produce assignments or appear for make-up tests.

Teachers running smack up against such policies learn that the ‘research’ supporting “no zeros” policy can be traced back to an October 2004 Thomas Guskey article in the Principal Leadership magazine entitled “Zero Alternatives.”

Manitoba social studies teacher Michael Zwaagstra analyzed Guskey’s research and found it wanting.  His claim that awarding zeros was a questionable practice rested on a single 20-year-old opinion-based presentation by an Oregon English teacher to the 1993 National Middle School conference. Guskey’s subsequent books either repeat that reference or simply restate his hypothesis as an incontestable truth.

Guskey’s theories are certainly not new. Much of the research dates back to the early 1990s and the work of William Spady, a Mastery Learning theorist known as the prime architect of the ill-fated Outcomes-Based Education (OBE) movement. OBE was best exemplified by the infamous mind-boggling systematized report cards loaded with hundreds of learning outcomes, and it capsized in the early 2000s in the wake of a storm of public and professional opposition in Pennsylvania and a number of other states.

The litmus test for education reform initiatives is now set at a rather low bar – “do no harm” to teachers or students. What Thomas Guskey is spouting begs for more serious investigation. One red flag is his continued reference to “learning styles” and “multiple intelligences,” two concepts that lack supporting evidence and are now widely considered abandoned theories.

Guskey’s student assessment theories fly mostly in the face of the weight of recent research, including that of Dylan Wiliam. Much of the best research is synthesized in Daisy Christodoulou’s 2017 book, Making Good Progress. Such initiatives float on unproven theories, lack supporting evidence-based research, chip away at teacher autonomy, and leave classroom practitioners snowed under with heavier ‘new age’ marking loads.

A word to the wise for  P.E.I. Education leadership – look closely before you leap. Take a closer look at the latest research on teacher-driven student assessment and why OBE was rejected twenty years ago by classroom teachers and legions of skeptical parents.

What’s really new about Dr. Thomas Guskey’s latest project known as Competency-Based Assessment? What is its appeal for classroom teachers concerned about time-consuming, labour-intensive assessment schemes?  Will engaged and informed parents ever accept the elimination of student grades? Where’s the evidence-based research to support changes based upon such untested theories? 

Read Full Post »

Where you live can greatly influence the educational outcomes of your children. Some education observers go so far as to say: “The quality of education is determined by your postal code.” In school systems with strict student attendance zones, it is, for all intents and purposes, the iron law of public education.

Students, whatever their background, can overcome significant disadvantages. “Your destiny is in your hands, and don’t you forget that,” as former U.S. President Barack Obama famously said in July 2009. “That’s what we have to teach all of our children! No excuses! No excuses!”

There is a fine line between identifying struggling schools and ‘labeling’ them. “We identify schools and where they are on the improvement journey,” says Elwin LeRoux, Regional Director of Education in Halifax, Nova Scotia. “Yet we are careful not to ‘label’ some schools in ways that may carry negative connotations and influence student attitudes.”

How a school district identifies struggling schools and how it responds is what matters. Accepting the socio-economic dictates or ignoring the stark realities is not good enough. It only serves to reinforce ingrained assumptions, contribute to lowered academic expectations, and possibly adversely affect school leadership, student behaviour standards, teacher attitudes, and parent-school relations.

While there are risks involved in comparing school performance, parents and the public are entitled to know more about how students in our public schools are actually performing. The Halifax Chronicle Herald broke the taboo in November 2018 and followed the path blazed by other daily papers, including The Globe and Mail and the Hamilton Spectator, in providing a school-by-school analysis of school performance in relation to socio-economic factors influencing student success. The series was based upon extensive research conducted for the Atlantic Institute for Market Studies (AIMS).

A Case Study – the Halifax Public School System

The Halifax Regional Centre for Education (formerly the Halifax Regional School Board) enrolls 47,770 students in 135 schools, employs 4,000 school-based teachers, and provides a perfect lens through which to tackle the whole question. Student achievement and attainment results over the past decade, from 2008-09 to 2015-16, have been published in school-by-school community reports and, when aggregated, provide clear evidence of how schools are actually performing in Halifax Region.

Unlike many Canadian boards, the HRCE is organized in an asymmetrical fashion with a mixed variety of organizational units: elementary schools (84), junior high/middle schools (27), senior elementary (7), P-12 academy (1), junior-senior high schools (6), and senior high schools (10).   Current student enrolment figures, by school division, stand at 25,837 for Primary to Grade 6, 11,245 for Grades 7 to 9, and 10,688 for Grades 10 to 12.

Student Achievement and School Improvement

Since November of 2009, the Halifax system has been more open and transparent in reporting on student assessment results as a component of its system-wide improvement plan. Former Superintendent Carole Olsen introduced the existing accountability system along with a new mission that set a far more specific goal: “Every Student will Learn, every School will Improve.”

The Superintendent’s 2008-09 report was introduced with great fanfare with an aspirational goal of transforming “Good Schools to Great Schools” and a firm system-wide commitment that “every school, by 2013, will demonstrate improvement in student learning.” Following the release of aggregated board-wide data, the HRSB produced school-by-school accountability reports, made freely available to not only the School Advisory Councils (SACs), but to all parents in each school.

Superintendent Olsen set out what she described as “a bold vision” to create “a network of great schools” in “thriving communities” that “bring out the best in us.” School-by-school reporting was critical to that whole project. “Knowing how each school is doing is the first important step in making sure resources and support reach the schools – and the students—that need them the most,” Olsen declared.

The Established Benchmark – School Year 2008-09

The school year 2008-09, the first year in the HRSB’s system-wide improvement initiative, provided the benchmark, not only for the board, but for the AIMS research report taking stock of student achievement and school-by-school performance over the past decade.

In 2008-09, the first set of student results in the two core competencies, reading and math, demonstrated that HRSB student scores were comparable to other Canadian school systems, but there was room for improvement. In Grade 2 reading, the system-wide target was that 77 per cent of all students would meet established board standards. Only 25 out of some 91 schools (27.5 %) met or exceeded the established target.

While Grade 2 and Grade 5 Mathematics students performed better, problems surfaced at the Grade 8 level, where two out of three schools (67.5 %) failed to meet the HRSB standard. High numbers of Grade 8 students were struggling with measurement, whole number operations (multiplication, division), problem-solving, and communication.

System Leadership Change and Policy Shifts

Schools in the Halifax school system may have exceeded the initial public expectations, but the vast majority of those schools fell far short of moving from “Good Schools to Great Schools.” Some gains were made in student success rates in the two core competencies, reading and mathematics, by the 2013 target year, but not enough to match the aspirational goals set by Superintendent Olsen and the elected school board.


With Olsen’s appointment in September 2012 as Deputy Minister of Education for Nova Scotia, the robust HRSB commitment to school-by-school improvement and demonstrably improved standards in reading and mathematics faltered. Her successor, LeRoux, a 24-year board veteran, espoused more modest goals and demonstrated a more collegial, low-key leadership style. Without comprehensive school system performance reports, the school community reports, appended routinely as PDFs to school websites, attracted little attention.

The “Good Schools to Great Schools” initiative had failed to work miracles. That became apparent in May 2014, following the release of the latest round of provincial literacy assessments.  The formal report to the Board put it bluntly: “A large achievement gap exists between overall board results and those students who live in poverty.”

School administration, based upon research conducted in-house by psychologist Karen Lemmon, identified schools in need of assistance when more than one-third of the family population in a school catchment could be classified as “low income” households. Twenty of its 84 elementary schools were identified and designated as “Priority Schools” requiring more attention, enhanced resources, and extra support programs to close the student achievement gap.

The focus changed, once again, following the release of the 2017-18 provincial results in Grade 6 Math and Literacy. Confronted with those disappointing results, the HRSB began to acknowledge that students living in poverty came disproportionately from marginalized communities.

Instead of focusing broadly on students in poverty, the Board turned its attention to the under-performance of Grade 6 students from African/black and Mi’kmaq/Indigenous communities. For students of African ancestry, for example, Grade 6 Mathematics scores declined by 6 per cent, leaving less than half (49 per cent) meeting provincial standards. What started out as a school improvement project focused on lower socio-economic schools had evolved into one addressing differences along ethno-racial lines.

Summaries of the AIMS Research Report Findings

Stark Inequalities – High Performing and Struggling Schools

Hopeful Signs – Most Improved Schools

Summation and Recommendations – What More Can Be Done?

Putting the Findings in Context

School-by-school comparative studies run smack up against the hard realities of the socio-economic context affecting children’s lives and their school experiences. Not all public schools from Pre-Primary to Grade 12 are created equal: some enjoy advantages that far exceed others, while schools in disadvantaged communities struggle to retain students and are unable, given the conditions, to move the needle on school improvement. So, what can be done to break the cycle?

Questions for Discussion

Comparing school-by-school performance over the past decade yields some startling results and raises a few critical questions:  Is the quality of your education largely determined by your postal code in Canadian public school systems? What are the dangers inherent in accepting the dictates of socio-economic factors with respect to student performance?  What overall strategies work best in breaking the cycle of stagnating improvement and chronic under-performance? Should school systems be investing less in internal “learning supports” and more in rebuilding school communities themselves? 

Read Full Post »

The latest student achievement results, featured in the April 30, 2018 Pan-Canadian Assessment Program (PCAP) 2016 report, prove, once again, how system-critical testing is for K-12 education. Students in every Canadian province except Ontario saw gains in Grade 8 student scores from 2010 to 2016 and we are now much the wiser. That educational reality check simply confirms that it’s no time to be jettisoning Ontario’s Grade 3 provincial tests and chipping away at the reputation of the province’s independent testing agency, the Education Quality and Accountability Office (EQAO).

The plan to end Grade 3 provincial testing arrived with the final report of Ontario: A Learning Province, produced by OISE professor Carol Campbell and her team of six supposedly independent advisors, including well-known change theorists Michael Fullan, Andy Hargreaves and Jean Clinton. Targeting of the EQAO was telegraphed in an earlier discussion paper, but the consultation phase focused ostensibly more on “broadening measures of student success” beyond achievement and into the largely uncharted realm of “social and emotional learning” (SEL).

The final report stunned many close observers in Ontario who had expected much more from the review, in particular an SEL framework for assessment and a new set of "student well-being" reports for the 2018-19 school year. Tampering with Grade 3 testing made former Ontario Deputy Minister Charles Pascal uncomfortable because it interfered with diagnosis for early interventions.

It also attracted a stiff rebuke from the world’s leading authority on formative assessment, British assessment specialist Dylan Wiliam. He was not impressed at all with the Campbell review committee report. While it was billed as a student assessment review, Wiliam noted that none of the committee members is known for expertise in assessment, testing or evaluation.

Education insiders were betting that the Kathleen Wynne Liberal-friendly review team would simply unveil the plan for “broader student success” developed by Annie Kidder and her People for Education lobby group since 2012 and known as the “Measuring What Matters” project. It is now clear that something happened to disrupt the delivery of that carefully nurtured policy baby. Perhaps the impending Ontario provincial election was a factor.

Social and emotional learning is now at the very core of Ontario’s Achieving Excellence and Equity agenda and it fully embraces “supporting all students” and enabling them to achieve “a positive sense of well-being – the sense of self, identity, and belonging in the world that will help them to learn, grow and thrive.”

The Ontario model, hatched by the Education Ministry in collaboration with People for Education, is based upon a psycho-social theory that "well-being" has "four interconnected elements" critical to student development, with self/spirit at the centre. The whole formulation reflects the biases of its architects, since grit, growth mindset, respect and responsibility are nowhere to be found in the preferred set of social values inculcated in the system. Whatever the rationale, proceeding to integrate SEL into student reports and province-wide assessments is premature when recognized American experts Angela Duckworth and David Scott Yeager warn that such 'generic skills' are ill-defined and possibly unmeasurable.

Evidence-informed researchers such as Daisy Christodoulou, author of Making Good Progress (2017), do not support the proposed change in Ontario's student assessment focus. Generic or transferable skills approaches of the kind Ontario is considering generate generic feedback of limited value to students in the classroom. Relying too heavily on teacher assessments is also unwise because, as Christodoulou reminds us, disadvantaged students tend to fare better on larger-scale, objective tests. The proposed prose descriptors will, in all likelihood, be jargon-ridden, unintelligible to students and parents, and particularly inaccessible to students struggling in school.

One of the reasons Ontario has been recognized as a leading education system is its success over the past 20 years in building an independent EQAO with a professionally sound provincial testing program in Grades 3, 6, and 9, along with a Grade 10 literacy test that needs improvement. Legitimate teacher concerns about changes that increase marking loads do need to be addressed in any new student assessment plan, as do objections over the fuzzy, labour-intensive SEL student reports.

The proposal to phase out Ontario provincial testing may already be dead in the water. If it is, the April 30, 2018 editorial in The Toronto Star was almost certainly a contributing factor. If the Wynne Liberals go down to defeat in the June 2018 election, the whole plan will likely be shelved or completely revamped by a new government.

Whether you support the EQAO or not, the agency has succeeded in establishing reliable quality standards for student performance in literacy and mathematics. Abandoning Grade 3 testing and gutting the EQAO is not only ill-conceived but ill-advised. Without the PCAP and provincial achievement benchmarks, we would be flying blind into the future.

What can possibly be gained from eliminating system-wide Grade 3 provincial assessments?  How does that square with research suggesting early assessments are critical in addressing reading and numeracy difficulties?  Without Ontario, would it be possible to conduct comprehensive Grade 3 benchmarking across Canada?  If staff workload is the problem, then aren't there other ways to address that matter?  And whatever happened to the proposed Social and Emotional Learning (SEL) assessments and reports?

Read Full Post »

Ontario now aspires to global education leadership in the realm of student evaluation and reporting. The latest Ontario student assessment initiative, A Learning Province, announced in September 2017 and guided by OISE education professor Dr. Carol Campbell, cast a wide net encompassing classroom assessments, large-scale provincial tests, and national/international assessment programs. That vision for "student-centred assessments" worked from the assumption that future assessments would capture the totality of "students' experiences — their needs, learning, progress and well-being."

The sheer scope of the whole project not only deserves much closer scrutiny, but also needs to be carefully assessed for its potential impact on frontline teachers. A pithy statement by British teacher-researcher Daisy Christodoulou in January 2017 is germane to the point: "When government get their hands on anything involving the word 'assessment', they want it to be about high stakes monitoring and tracking, not about low-stakes diagnosis."  In the case of Ontario, pursuing the datafication of social-emotional learning and the mining of data to produce personality profiles is clearly taking precedence over the creation of teacher-friendly assessment policy and practices.

One of the reasons Ontario has been recognized as a leading education system is its success over the past 20 years in establishing an independent Education Quality and Accountability Office (EQAO) with a professionally sound provincial testing program in Grades 3, 6, 9 and 10.  Whether you support the EQAO or not, most agree that it has succeeded in establishing reliable benchmark standards for student performance in literacy and mathematics.

The entire focus of Ontario student assessment is now changing. Heavily influenced by the Ontario People for Education Measuring What Matters project, the province is plunging ahead with Social and Emotional Learning (SEL) assessment, embracing what Ben Williamson aptly describes as "stealth assessment" – a set of contested personality criteria utilizing SEL 'datafication' to measure "student well-being." Proceeding to integrate SEL into student reports and province-wide assessments is also foolhardy when American experts Angela Duckworth and David Scott Yeager warn that such 'generic skills' are ill-defined and possibly unmeasurable.

Social and emotional learning is now at the very core of Ontario’s Achieving Excellence and Equity agenda and it fully embraces “supporting all students” and enabling them to achieve “a positive sense of well-being – the sense of self, identity, and belonging in the world that will help them to learn, grow and thrive.” The Ontario model is based upon a psycho-social theory that “well-being” has “four interconnected elements” critical to student development, with self/spirit at the centre. Promoting student well-being is about fostering learning environments exhibiting these elements:

Cognitive: Development of abilities and skills such as critical thinking, problem solving, creativity, and the ability to be flexible and innovative.

Emotional: Learning about experiencing emotions, and understanding how to recognize, manage, and cope with them.

Social: Development of self-awareness, including the sense of belonging, collaboration, relationships with others, and communication skills.

Physical: Development of the body, impacted by physical activity, sleep patterns, healthy eating, and healthy life choices.

Self/Spirit: Recognizing the core of identity which has "different meanings for different people, and can include cultural heritage, language, community, religion or a broader spirituality."

Ontario’s new student report cards, proposed for 2018-19 implementation, will incorporate a distinct SEL component with teacher evaluations on a set of "transferable skills," shifting the focus from organization and work habits to "well-being" and associated values, while retaining grades or marks for individual classes. The Ontario Education "Big Six" Transferable Skills are: critical thinking, innovation and creativity, self-directed learning, collaboration, communication, and citizenship.  Curiously absent from the Ontario list of preferred skills are those commonly found in American variations on the formula: grit, growth mindset, and character.

The emerging Ontario student assessment strategy needs to be evaluated in relation to the latest research and best practice, exemplified in Dylan Wiliam’s student assessment research and Daisy Christodoulou’s 2017 book Making Good Progress: The Future of Assessment for Learning.  Viewed through that lens, the Ontario student assessment philosophy and practice falls short on a number of counts.

  1. The Generic Skills Approach: Adopting this approach reflects a fundamental misunderstanding about how students learn and acquire meaningful skills. Tackling problem-solving at the outset, utilizing Project-Based Learning to "solve real-life problems," is misguided because knowledge and skills are better acquired through other means. The "deliberate practice method" has proven more effective. Far more is learned when students break down skills into a ‘progression of understanding’ — acquiring the knowledge and skill to progress on to bigger problems.
  2. Generic Feedback: Generic or transferable skills prove to be unsound when used as a basis for student reporting and feedback on student progress. Skills are not taught in the abstract, so feedback has little meaning for students. Reading a story and making inferences, for example, is not a discrete skill; it is dependent upon knowledge of vocabulary and background context to achieve reading comprehension.
  3. Hidden Bias of Teacher Assessment: Teacher classroom assessments are highly desirable, but do not prove as reliable as standardized measures administered under fair and objective conditions. Disadvantaged students, based upon reliable, peer-reviewed research, do better on standardized tests than on regular teacher assessments. "Teacher assessment is biased not because it is carried out by teachers, but because it is carried out by humans."
  4. Unhelpful Prose Descriptors: Most verbal descriptors used in system-wide assessments and reports are unhelpful: they tend to be jargon-ridden, unintelligible to students and parents, and particularly inaccessible to students struggling in school. Second-generation descriptors are "pupil friendly" but still prove difficult to use in learning how to improve or correct errors.
  5. Work-Generating Assessments: Poorly constructed system-wide assessments generate unplanned and unexpected marking loads, particularly in the case of qualitative assessments with rubrics or longer marking times. In the U.K., for example, the use of grade descriptors for feedback proved much more time-consuming than normal grading of written work. Primary teachers who spent 5 hours a week on assessment in 2010 found that, by 2013, they were spending 10 hours a week.

What’s wrong with the new Ontario Assessment Plan, and what needs rethinking?
  1. The Generic Skills Approach – Teaching generic skills (SEL) doesn’t work and devalues domain-specific knowledge
  2. Social and Emotional Learning (SEL) Models – carry inherent biases and are unmeasurable
  3. Breach of Student Security – Data mining and student surveys generate personality data without consent
  4. Erosion of Teacher Autonomy – Student SEL data generated by algorithms creates more record-keeping and more marking, and cuts into classroom time

The best evidence-based assessment research, applied in deconstructing the Ontario Assessment initiative, raises red flags.  Bad student assessment practices, as Wiliam and Christodoulou show, can lead to serious workload problems for classroom teachers. No education jurisdiction living up to the motto "Learning Province" would plow ahead when the light turns amber.

A summary of the researchED Ontario presentation delivered April 14, 2018, at the Toronto Airport Westin Hotel. 

Where is the new Ontario student assessment initiative really heading? Is it a thinly-disguised attempt to create a counterweight to current large-scale student achievement assessments? Is it feasible to proceed with SEL assessment when leading researchers question its legitimacy and validity? Are we running the risk of opening the door to the wholesale mining of student personal information without consent and for questionable purposes? 

Read Full Post »

Older Posts »