Archive for the ‘Student Assessment’ Category

The Homework Debate never seems to go away. Popular books and articles inspired initially by American education writer Alfie Kohn and his Canadian disciples continue to beat the drum for easing the homework burden on students, or eliminating homework altogether before the secondary school level. That “No Homework” movement made significant inroads in the United States and Canada during the 2000s. The Organisation for Economic Co-operation and Development (OECD), responsible for the Programme for International Student Assessment (PISA), confirmed that the amount of time North American students spend doing homework had declined, as of the 2014 assessment year.


A critical question needs to be asked: Have the “No Homework” movement and the apparent push-back against homework had an adverse effect on student achievement? That’s difficult to answer because, despite the critical importance of the issue and the long history of homework research, few North American researchers have been inclined to study the role that homework plays in enhancing student achievement, even in mathematics.

One little-known researcher, Lake B. Yeworiew, an Ethiopian scholar based at the University of Calgary and recently arrived in Canada, saw the hole in the research and tackled the whole question. His focus was on assessing the relationship between homework and Grade 8 Mathematics achievement, comparing Canadian students with the top-performing students in the world. While attending the AERA 2019 Annual Meeting (April 5-9) in Toronto, I got a sneak peek at his findings. While his research study attracted little attention, it will be of considerable interest to all of those committed to maintaining and improving student performance standards.

His University of Calgary study, co-authored with Man-Wai Chu and Yue Xu, laid out the essential facts: the average performance of Canadian students in Mathematics (PISA) has declined since 2006 (OECD, 2007, 2010, 2014, 2016). Students from three top-performing Asian jurisdictions, Singapore, Macau-China and Japan, continue to outperform our 15-year-old students by a significant margin. Furthermore, OECD reports that students in Asian jurisdictions (Singapore, Japan, Macao-China and Hong Kong-China) spend more time doing homework and score much higher. It is estimated that they score 17 points or more per extra hour of homework.

Recent North American research seems more alert to the need to study the relationship between homework and academic achievement, particularly in mathematics. A literature review, conducted by Yeworiew, Chu and Xu, demonstrates that, while the findings cut in both directions, the weight of research favours homework. In fact, the Council of Ministers of Education, Canada (CMEC, 2014) has come down in favour of homework. Based upon Canadian national test surveys (PCAP), CMEC confirms that the math achievement of students who do not do homework is significantly lower than that of students doing regular homework.

Yeworiew and his research team provide further confirmation of this 2014 CMEC assessment. Utilizing the 2015 TIMSS study in Canada, involving 8,757 students and 276 schools in four provinces (Ontario, Quebec, Manitoba and Newfoundland/Labrador), the authors demonstrate the clear value of regular homework in modest amounts.

The research findings are effectively presented in a series of graphs mapping the study results, reprinted directly from their AERA 2019 Toronto presentation.

The relationship between homework and achievement is becoming less of a mystery. Based upon the performance of Grade 8 students in the 2015 TIMSS study, short but frequent homework assignments contribute to improved student learning and achievement in mathematics. Frequent homework assignments, up to four times a week, have a positive effect on math achievement, but less so when the homework is of longer duration. No discernible differences were detected between girls and boys at the Grade 8 level in Canada.

Why do Canadian researchers produce so few studies like the University of Calgary project attempting to assess the impact of homework on achievement?  To what extent is it because Canadian homework studies tend to focus on psycho-social aspects such as the impact of homework on student attitudes and the opinions of parents?

Are we asking the right questions? “How much is enough?” is surely a sounder line of inquiry than “How do you feel when overburdened with homework?” What is really accomplished by asking “Does homework add to your anxieties?” Should we be more conscious of the inherent biases in such research questions?



Student report cards are a critical point of contact with parents and that’s why they attract more critical scrutiny than other aspects of K-12 education.  Most parents seek clear, intelligible, individualized, regular student progress reports with understandable grades, while student assessment consultants come up with wave-after-wave of changes modeling the latest proposed innovation in assessment practice.  That explains, in many ways, why the subterranean issue never seems to disappear.

Every five years or so, school authorities across the Canadian provinces attempt to revamp their student report cards, usually aiming to challenge the prevailing orthodoxy. The introduction of outcomes-based student assessment in the 1990s produced a new, impenetrable language accompanied by “competencies” and hundreds of “micro-outcomes.” Repeated attempts were made to replace letter grades in elementary schools and percentage marks in high schools with outcomes-based reporting and newly-constructed scales of development in learning. That initial wave produced what have become standardized, digitally-generated provincial or school district report templates.

Most top-down report card modernization plans end up imposing heavier reporting loads on teachers and leaving most parents baffled. Six years ago, Nova Scotia parent Marshall Hamilton spoke for perhaps hundreds of thousands of parents: “I don’t see my child in the comments.” “The language doesn’t really give the parent or the child any idea of critical feedback,” he explained to CBC News. “I can probably figure out more about what the curriculum is meant to do than understand my daughter’s performance in that curriculum.”

Student report cards in Canadian school systems are, in theory, intended to provide “meaningful information” to parents and guardians on “how their child is progressing in school.” Since that wave of parent criticism six years ago, Nova Scotia’s student reports have become far clearer and more intelligible, with actual marks from Grades 7 to 12, but there are still a few missing pieces.

Legitimate concerns about teachers’ classroom conditions and workloads sometimes prompt initiatives to “streamline” reporting that have unintended consequences.  Surveying his daughter’s November 2018 Grade 6 Nova Scotia report, former teacher Kristopher Snarby was surprised to see that it provided no feedback on subject courses representing over half her weekly schedule. Report cards from Grades P to 6, Snarby discovered, only contained marks and comments on Language Arts and Mathematics, providing no marks, comments or attendance for any of her other subjects. The standard provincial report template simply did not fit his daughter’s school, where multiple teachers taught a variety of subjects.

Those report card changes originated back in March 2018 as one of the recommendations to “streamline” November reports from a provincial teacher advisory body, the Council to Improve Classroom Conditions. The problem, as framed by the Council, was “time-consuming” reporting processes and reports that were “confusing to parents.” The solution: reduce “data entry for teachers” and provide “integrated comments” for only two subjects, Language Arts and Math.

Making reports less comprehensive with fewer subject specific comments would never fly with parents who, after all, are the main consumers of those reports. If they were ever asked, they would also likely favour reports with more definitive feedback, including individual student assessment test results in Grades 3, 6 and 8.  Elementary student progress reports that provide feedback on integrated subjects also tend to obscure how students are actually performing in two critical areas, reading and numeracy.

Providing parents with reports including their own child’s provincial assessment scores would remedy that omission. That is not such an outlandish idea when one considers the latest teacher-friendly innovations (Christodoulou, 2017) in student assessment reporting. Most North American school authorities are actually providing more information, not less, on both school standards and individual student performance.

Take Ontario, for example. Students in Ontario are all tested in Grades 3 and 6 and, while the results do not appear on school progress reports, the independent Education Quality and Accountability Office (EQAO) provides parents with a detailed individual report on their child’s progress, benchmarked against provincial student performance standards.

The EQAO individual student report card for Primary Division (Grades 1-3) provides incredibly detailed feedback on reading, writing, and mathematics, reflecting four distinct levels of achievement. It’s also relatively easy to identify how students actually measure up in their performance.

The Grade 9 EQAO math test is a component of the regular school report, accounting for up to 10 per cent of a student’s math mark. Ontario students are also required to pass a Grade 10 Literacy Test or remedial Literacy course to secure a secondary school diploma.

Parents in Ontario are encouraged to work together in partnership with their teachers to improve student learning. “Talk to your child’s teacher,” the EQAO report advises, “about how these results compare to your child’s daily classroom work and assessment information.”

Providing parents with individual student reports on provincial assessment results would be a step forward, but integrating them into Grade 3, 6, and 8 school district reports would be even better. Then parents would be able to see, on one report, how students were performing not only in local schools, but in relation to provincial standards.

What Canadian education needs is more parents like Kristopher Snarby keeping an eye on changes in the system. As a former teacher, he is particularly alert to “teacher-speak” on reports that are “not really intelligible for parents.” “Feedback is critical for parents,” Snarby says, and “that’s why what’s on student reports  really matters.”

Do student report card reforms make matters better – or worse – for parents and students? Can we find the right balance between providing meaningful, individualized reports and easing teachers’ workloads? What can possibly be wrong with giving teachers more autonomy to make more personal, pointed comments about actual student performance? Would it be helpful to see both teacher assessments and provincial test results on those reports?


University of Kentucky student assessment guru Thomas R. Guskey is back on the Canadian professional development circuit with a new version of what looks very much like Outcomes-Based Education. It is clear that he has the ear of the current leadership in the Prince Edward Island Education Department. For two days in late November 2018, he dazzled a captive audience of over 200 senior Island school administrators with his stock presentations extolling the virtues of mastery learning and competency-based student assessment.

P.E.I.’s Coordinator of Leadership and Learning Jane Hastelow was effusive in her praise for Guskey and his assessment theories. Tweets by educators emanating from the Guskey sessions parroted the gist of his message. “Students don’t always learn at the same rate or in the same order,” Guskey told the audience. So, why do we teach them in grades, award marks, and promote them in batches?

Grading students and assigning marks, according to Guskey, can have detrimental effects on children. “No research,” he claims, “supports the idea that low grades prompt students to try harder. More often, low grades lead students to withdraw from learning.”

Professional learning, in Guskey’s world, should be focused not on cognitive or knowledge-based learning, but on introducing “mastery learning” as a way of advancing “differentiated instruction” classrooms. “High-quality corrective instruction,” he told P.E.I. educators, “is not the same as ‘re-teaching.’” It is actually a means of training teachers to adopt new approaches that “accommodate differences in students’ learning styles, learning modalities, or types of intelligence.”

Guskey is well-known in North American education as the chief proponent of the elimination of percentage grades. For more than two decades, in countless PD presentations, he has promoted his own preferred brand of student assessment reform. “It’s time,” he insists, “to abandon grading scales that distort the accuracy, objectivity and reliability of students’ grades.”

Up-and-coming principals and curriculum leads, most without much knowledge of assessment, have proven to be putty in his hands. So what’s the problem? Simply put, Dr. Guskey’s theories, when translated into student evaluation policy and reporting, generate resistance among engaged parents looking for something completely different: clearer, understandable, jargon-free student reports with real marks. Classroom teachers soon come to realize that the new strategies and rubrics are far more complicated and time-consuming, often leaving them buried in additional workload.

Guskey’s student assessment theories do appeal to school administrators who espouse progressive educational principles. He specializes in promoting competency-based education grafted onto student-centred pedagogy or teaching methods.

Most regular teachers today are only too familiar with top-down reform designed to promote “assessment for learning” (AfL) and see, first hand, how it has led to the steady erosion of teacher autonomy in the classroom.

While AfL is a sound assessment philosophy, pioneered by the leading U.K. researcher Dylan Wiliam since the mid-1990s, it has proven difficult to implement. Good ideas can become discredited by poor implementation, especially when formative assessment becomes just another vehicle for a new generation of summative assessment used to validate standards.

Education leaders entranced by Guskey’s theories rarely delve into where it all leads for classroom teachers.  In Canada, it took the “no zeros” controversy sparked in May 2012 by Alberta teacher Lynden Dorval to bring the whole dispute into sharper relief. As a veteran high school Physics teacher, Dorval resisted his Edmonton high school’s policy which prevented him from assigning zeros when students, after repeated reminders, failed to produce assignments or appear for make-up tests.

Teachers running smack up against such policies learn that the ‘research’ supporting “no zeros” policy can be traced back to an October 2004 Thomas Guskey article in the Principal Leadership magazine entitled “Zero Alternatives.”

Manitoba social studies teacher Michael Zwaagstra analyzed Guskey’s research and found it wanting. Guskey’s claim that awarding zeros was a questionable practice rested on a single 20-year-old, opinion-based presentation by an Oregon English teacher to the 1993 National Middle School conference. His subsequent books either repeat that reference or simply restate the hypothesis as an incontestable truth.

Guskey’s theories are certainly not new. Much of the research dates back to the early 1990s and the work of William Spady, a Mastery Learning theorist known as the prime architect of the ill-fated Outcomes-Based Education (OBE) movement. OBE was best exemplified by the infamous, mind-boggling systematized report cards loaded with hundreds of learning outcomes, and it capsized in the early 2000s in the wake of a storm of public and professional opposition in Pennsylvania and a number of other states.

The litmus test for education reform initiatives is now set at a rather low bar: “do no harm” to teachers or students. What Thomas Guskey is spouting begs for more serious investigation. One red flag is his continued reference to “learning styles” and “multiple intelligences,” two theories that lack empirical support and are now widely considered debunked.

Guskey’s student assessment theories fly in the face of the weight of recent research, including that of Dylan Wiliam. Much of the best research is synthesized in Daisy Christodoulou’s 2017 book, Making Good Progress. Initiatives like Guskey’s float on unproven theories, lack supporting evidence-based research, chip away at teacher autonomy, and leave classroom practitioners snowed under with heavier ‘new age’ marking loads.

A word to the wise for P.E.I.’s education leadership: look closely before you leap. Take a closer look at the latest research on teacher-driven student assessment and at why OBE was rejected twenty years ago by classroom teachers and legions of skeptical parents.

What’s really new about Dr. Thomas Guskey’s latest project known as Competency-Based Assessment? What is its appeal for classroom teachers concerned about time-consuming, labour-intensive assessment schemes?  Will engaged and informed parents ever accept the elimination of student grades? Where’s the evidence-based research to support changes based upon such untested theories? 


Where you live can greatly influence the educational outcomes of your children. Some education observers go so far as to say: “The quality of education is determined by your postal code.” In school systems with strict student attendance zones, that is, for all intents and purposes, the iron law of public education.

Students, whatever their background, can overcome significant disadvantages. “Your destiny is in your hands, and don’t you forget that,” as former U.S. President Barack Obama famously said in July 2009. “That’s what we have to teach all of our children! No excuses! No excuses!”

There is a fine line between identifying struggling schools and ‘labeling’ them. “We identify schools and where they are on the improvement journey,” says Elwin LeRoux, Regional Director of Education in Halifax, Nova Scotia. “Yet we are careful not to ‘label’ some schools in ways that may carry negative connotations and influence student attitudes.”

How a school district identifies struggling schools and how it responds is what matters. Accepting socio-economic dictates or ignoring the stark realities is not good enough. It only serves to reinforce ingrained assumptions, contribute to lowered academic expectations, and possibly adversely affect school leadership, student behaviour standards, teacher attitudes, and parent-school relations.

While there are risks involved in comparing school performance, parents and the public are entitled to know more about how students in our public schools are actually performing. The Halifax Chronicle Herald broke the taboo in November 2018, following the path blazed by other daily papers, including The Globe and Mail and the Hamilton Spectator, in providing a school-by-school analysis of school performance in relation to socio-economic factors influencing student success. The series was based upon extensive research conducted for the Atlantic Institute for Market Studies (AIMS).

A Case Study – the Halifax Public School System

The Halifax Regional Centre for Education (formerly the Halifax Regional School Board) enrolls 47,770 students in 135 schools, employs 4,000 school-based teachers, and provides a perfect lens through which to tackle the whole question. Student achievement and attainment results over the past decade, from 2008-09 to 2015-16, have been published in school-by-school community reports and, when aggregated, provide clear evidence of how schools are actually performing in Halifax Region.

Unlike many Canadian boards, the HRCE is organized in an asymmetrical fashion with a mixed variety of organizational units: elementary schools (84), junior high/middle schools (27), senior elementary (7), P-12 academy (1), junior-senior high schools (6), and senior high schools (10). Current student enrolment figures, by school division, stand at 25,837 for Primary to Grade 6, 11,245 for Grades 7 to 9, and 10,688 for Grades 10 to 12.

Student Achievement and School Improvement

Since November of 2009, the Halifax system has been more open and transparent in reporting on student assessment results as a component of its system-wide improvement plan. Former Superintendent Carole Olsen introduced the existing accountability system along with a new mission that set a far more specific goal: “Every Student will Learn, every School will Improve.”

The Superintendent’s 2008-09 report was introduced with great fanfare, with an aspirational goal of transforming “Good Schools to Great Schools” and a firm system-wide commitment that “every school, by 2013, will demonstrate improvement in student learning.” Following the release of aggregated board-wide data, the HRSB produced school-by-school accountability reports, made freely available not only to the School Advisory Councils (SACs), but to all parents in each school.

Superintendent Olsen set out what she described as “a bold vision” to create “a network of great schools” in “thriving communities” that “bring out the best in us.” School-by-school reporting was critical to that whole project. “Knowing how each school is doing is the first important step in making sure resources and support reach the schools – and the students—that need them the most,” Olsen declared.

The Established Benchmark – School Year 2008-09

The school year 2008-09, the first year in the HRSB’s system-wide improvement initiative, provided the benchmark, not only for the board, but for the AIMS research report taking stock of student achievement and school-by-school performance over the past decade.

In 2008-09, the first set of student results in the two core competencies, reading and math, demonstrated that HRSB student scores were comparable to other Canadian school systems, but there was room for improvement. In Grade 2 reading, the system-wide target was that 77 per cent of all students would meet established board standards. Only 25 out of some 91 schools (27.5 %) met or exceeded the established target.

While Grade 2 and Grade 5 Mathematics students performed better, problems surfaced at the Grade 8 level, where two out of three schools (67.5 %) failed to meet the HRSB standard. High numbers of Grade 8 students were struggling with measurement, whole number operations (multiplication, division), problem-solving, and communication.

System Leadership Change and Policy Shifts

Schools in the Halifax school system may have exceeded the initial public expectations, but the vast majority of those schools fell far short of moving from “Good Schools to Great Schools.” Some gains were made in student success rates in the two core competencies, reading and mathematics, by the 2013 target year, but not enough to match the aspirational goals set by Superintendent Olsen and the elected school board.

With Olsen’s appointment in September 2012 as Deputy Minister of Education for Nova Scotia, the robust HRSB commitment to school-by-school improvement and demonstrably improved standards in reading and mathematics faltered. Her successor, LeRoux, a 24-year board veteran, espoused more modest goals and demonstrated a more collegial, low-key leadership style. Without comprehensive school system performance reports, the school community reports, appended routinely as PDFs to school websites, attracted little attention.

The “Good Schools to Great Schools” initiative had failed to work miracles. That became apparent in May 2014, following the release of the latest round of provincial literacy assessments.  The formal report to the Board put it bluntly: “A large achievement gap exists between overall board results and those students who live in poverty.”

School administration, based upon research conducted in-house by psychologist Karen Lemmon, identified schools in need of assistance when more than one-third of the family population in a school catchment could be classified as “low income” households. Twenty of its 84 elementary schools were identified and designated as “Priority Schools” requiring more attention, enhanced resources, and extra support programs to close the student achievement gap.

The focus changed, once again, following the release of the 2017-18 provincial results in Grade 6 Math and Literacy. Confronted with those disappointing results, the HRSB began to acknowledge that students living in poverty came disproportionately from marginalized communities.

Instead of focusing broadly on students in poverty, the Board turned its attention to the under-performance of Grade 6 students from African/black and Mi’kmaq/Indigenous communities. For students of African ancestry, for example, Grade 6 Mathematics scores declined by 6 per cent, leaving less than half (49 per cent) meeting provincial standards. What started out as a school improvement project focused on lower socio-economic schools had evolved into one addressing differences along ethno-racial lines.

Summaries of the AIMS Research Report Findings

Stark Inequalities – High Performing and Struggling Schools

Hopeful Signs – Most Improved Schools

Summation and Recommendations – What More Can Be Done?

Putting the Findings in Context

School-by-school comparative studies run smack up against the hard realities of the socio-economic context affecting children’s lives and their school experiences. Public schools from Pre-Primary to Grade 12 are not all created equal: some enjoy advantages that far exceed those of others, while schools in disadvantaged communities struggle to retain students and are unable, given the conditions, to move the needle on school improvement. So, what can be done to break the cycle?

Questions for Discussion

Comparing school-by-school performance over the past decade yields some startling results and raises a few critical questions:  Is the quality of your education largely determined by your postal code in Canadian public school systems? What are the dangers inherent in accepting the dictates of socio-economic factors with respect to student performance?  What overall strategies work best in breaking the cycle of stagnating improvement and chronic under-performance? Should school systems be investing less in internal “learning supports” and more in rebuilding school communities themselves? 


The latest student achievement results, featured in the April 30, 2018 Pan-Canadian Assessment Program (PCAP) 2016 report, prove, once again, how system-critical testing is for K-12 education. Students in every Canadian province except Ontario saw gains in Grade 8 student scores from 2010 to 2016 and we are now much the wiser. That educational reality check simply confirms that it’s no time to be jettisoning Ontario’s Grade 3 provincial tests and chipping away at the reputation of the province’s independent testing agency, the Education Quality and Accountability Office (EQAO).

The plan to end Grade 3 provincial testing arrived with the final report of Ontario: A Learning Province, produced by OISE professor Carol Campbell and her team of six supposedly independent advisors, including well-known change theorists Michael Fullan, Andy Hargreaves and Jean Clinton. Targeting of the EQAO was telegraphed in an earlier discussion paper, but the consultation phase focused ostensibly more on “broadening measures of student success” beyond achievement and into the largely uncharted realm of “social and emotional learning” (SEL).

The final report stunned many close observers in Ontario who expected much more from the review, in particular an SEL framework for assessment and a new set of “student well-being” reports for the 2018-19 school year. Tampering with Grade 3 testing made former Ontario Deputy Minister Charles Pascal uncomfortable because it interfered with diagnosis for early interventions.

It also attracted a stiff rebuke from the world’s leading authority on formative assessment, British assessment specialist Dylan Wiliam. He was not impressed at all with the Campbell review committee report. While it was billed as a student assessment review, Wiliam noted that none of the committee members is known for expertise in assessment, testing or evaluation.

Education insiders were betting that the Kathleen Wynne Liberal-friendly review team would simply unveil the plan for “broader student success” developed by Annie Kidder and her People for Education lobby group since 2012 and known as the “Measuring What Matters” project. It is now clear that something happened to disrupt the delivery of that carefully nurtured policy baby. Perhaps the impending Ontario provincial election was a factor.

Social and emotional learning is now at the very core of Ontario’s Achieving Excellence and Equity agenda and it fully embraces “supporting all students” and enabling them to achieve “a positive sense of well-being – the sense of self, identity, and belonging in the world that will help them to learn, grow and thrive.”

The Ontario model, hatched by the Education Ministry in collaboration with People for Education, is based upon a psycho-social theory that “well-being” has “four interconnected elements” critical to student development, with self/spirit at the centre. The whole formulation reflects the biases of its architects, since grit, growth mindset, respect and responsibility are nowhere to be found in the preferred set of social values inculcated in the system. Whatever the rationale, proceeding to integrate SEL into student reports and province-wide assessments is premature when recognized American experts Angela Duckworth and David Scott Yeager warn that the ‘generic skills’ are ill-defined and possibly unmeasurable.

Evidence-informed researchers such as Daisy Christodoulou, author of Making Good Progress (2017), do not support the proposed change in Ontario’s student assessment focus. Generic or transferable-skills approaches such as the one Ontario is considering generate generic feedback of limited value to students in the classroom. Relying too heavily on teacher assessments is unwise because, as Christodoulou reminds us, disadvantaged students tend to fare better on larger-scale, objective tests. The proposed prose descriptors will, in all likelihood, be jargon-ridden, unintelligible to students and parents, and prove particularly inaccessible to students struggling in school.

One of the reasons Ontario has been recognized as a leading education system is because of its success over the past 20 years in establishing an independent EQAO with an established and professionally-sound provincial testing program in Grades 3, 6, and 9 and a Grade 10 literacy test that needs improvement. Legitimate teacher concerns about changes that increase marking loads do need to be addressed in any new student assessment plan and so do objections over the fuzzy, labour-intensive SEL student reports.

The proposal to phase out Ontario provincial testing may already be dead in the water.  If it is, you can guess that the April 30, 2018 editorial in The Toronto Star was definitely a contributing factor.  If the Wynne Liberals go down to defeat in the June 2018 election, the whole plan will likely be shelved or completely revamped by a new government.

Whether you support the EQAO or not, the agency has succeeded in establishing reliable quality standards for student performance in literacy and mathematics. Abandoning Grade 3 testing and gutting the EQAO is not only ill-conceived but also ill-advised. Without the PCAP and provincial achievement benchmarks, we would be flying blind into the future.

What can possibly be gained from eliminating system-wide Grade 3 provincial assessments?  How does that square with research suggesting early assessments are critical in addressing reading and numeracy difficulties?  Without Ontario, would it be possible to conduct comprehensive Grade 3 bench-marking across Canada?  If staff workload is the problem, then aren’t there other ways to address that matter?  And whatever happened to the proposed Social and Emotional Learning (SEL) assessments and reports? 

Read Full Post »

Ontario now aspires to global education leadership in the realm of student evaluation and reporting. The latest Ontario student assessment initiative, A Learning Province, announced in September 2017 and guided by OISE education professor Dr. Carol Campbell, cast a wide net encompassing classroom assessments, large-scale provincial tests, and national/international assessment programs.  That vision for “student-centred assessments” worked from the assumption that future assessments would capture the totality of “students’ experiences — their needs, learning, progress and well-being.”

The sheer scope of the whole project not only deserves much closer scrutiny, but needs to be carefully assessed for its potential impact on frontline teachers. A pithy statement by British teacher-researcher Daisy Christodoulou in January 2017 is germane to the point: “When governments get their hands on anything involving the word ‘assessment’, they want it to be about high-stakes monitoring and tracking, not about low-stakes diagnosis.”  In the case of Ontario, pursuing the datafication of social-emotional learning and the mining of data to produce personality profiles is clearly taking precedence over the creation of teacher-friendly assessment policy and practices.

One of the reasons Ontario has been recognized as a leading education system is because of its success over the past 20 years in establishing an independent Education Quality and Accountability Office (EQAO) with an established and professionally sound provincial testing program in Grades 3, 6, 9 and 10.  Whether you support the EQAO or not, most agree that it has succeeded in establishing reliable benchmark standards for student performance in literacy and mathematics.

The entire focus of Ontario student assessment is now changing. Heavily influenced by the Ontario People for Education Measuring What Matters project, the province is plunging ahead with Social and Emotional Learning (SEL) assessment, embracing what Ben Williamson aptly describes as “stealth assessment” – a set of contested personality criteria utilizing SEL ‘datafication’ to measure “student well-being.” Proceeding to integrate SEL into student reports and province-wide assessments is also foolhardy when American experts Angela Duckworth and David Scott Yeager warn that the ‘generic skills’ are ill-defined and possibly unmeasurable.

Social and emotional learning is now at the very core of Ontario’s Achieving Excellence and Equity agenda and it fully embraces “supporting all students” and enabling them to achieve “a positive sense of well-being – the sense of self, identity, and belonging in the world that will help them to learn, grow and thrive.” The Ontario model is based upon a psycho-social theory that “well-being” has “four interconnected elements” critical to student development, with self/spirit at the centre. Promoting student well-being is about fostering learning environments exhibiting these elements:

Cognitive: Development of abilities and skills such as critical thinking, problem solving, creativity, and the ability to be flexible and innovative.

Emotional: Learning about experiencing emotions, and understanding how to recognize, manage, and cope with them.

Social: Development of self-awareness, including the sense of belonging, collaboration, relationships with others, and communication skills.

Physical: Development of the body, impacted by physical activity, sleep patterns, healthy eating, and healthy life choices.

Self/Spirit:  Recognizing the core of identity which has “different meanings for different people, and can include cultural heritage, language, community, religion or a broader spirituality.”

Ontario’s new student report cards, proposed for 2018-19 implementation, will incorporate a distinct SEL component with teacher evaluations on a set of “transferable skills,” shifting the focus from organization and work habits to “well-being” and associated values, while retaining grades or marks for individual classes. The Ontario Education “Big Six” Transferable Skills are: critical thinking, innovation and creativity, self-directed learning, collaboration, communication, and citizenship.  Curiously absent from the Ontario list of preferred skills are those commonly found in American variations on the formula: grit, growth mindset, and character.

The emerging Ontario student assessment strategy needs to be evaluated in relation to the latest research and best practice, exemplified in Dylan Wiliam’s student assessment research and Daisy Christodoulou’s 2017 book Making Good Progress: The Future of Assessment for Learning.  Viewed through that lens, the Ontario student assessment philosophy and practice falls short on a number of counts.

  1. The Generic Skills Approach: Adopting this approach reflects a fundamental misunderstanding of how students learn and acquire meaningful skills. Tackling problem-solving at the outset, utilizing Project-Based Learning to “solve real-life problems,” is misguided because knowledge and skills are better acquired through other means. The “deliberate practice method” has proven more effective. Far more is learned when students break down skills into a ‘progression of understanding’ — acquiring the knowledge and skill to progress on to bigger problems.
  2. Generic Feedback: Generic or transferable skills prove to be unsound when used as a basis for student reporting and feedback on student progress. Skills are not taught in the abstract, so feedback has little meaning for students. Reading a story and making inferences, for example, is not a discrete skill; it is dependent upon knowledge of vocabulary and background context to achieve reading comprehension.
  3. Hidden Bias of Teacher Assessment: Teacher classroom assessments are highly desirable, but do not prove as reliable as standardized measures administered under fair and objective conditions. Disadvantaged students, based upon reliable, peer-reviewed research, do better on standardized tests than on regular teacher assessments. “Teacher assessment is biased not because it is carried out by teachers, but because it is carried out by humans.”
  4. Unhelpful Prose Descriptors: Most verbal descriptors used in system-wide assessments and reports are unhelpful — they tend to be jargon-ridden, unintelligible to students and parents, and prove particularly inaccessible to students struggling in school. Second-generation descriptors are “pupil friendly” but still prove difficult to use in learning how to improve or correct errors.
  5. Work-Generating Assessments: System-wide assessments, poorly constructed, generate unplanned and unexpected marking loads, particularly in the case of qualitative assessments with rubrics or longer marking times. In the U.K., for example, the use of grade descriptors for feedback proved much more time-consuming than normal grading of written work. Primary teachers who spent 5 hours a week on assessment in 2010 found that, by 2013, they were spending 10 hours a week.

What’s wrong with the new Ontario Assessment Plan, and what needs rethinking?
  1. The Generic Skills Approach – Teaching generic skills (SEL) doesn’t work and devalues domain-specific knowledge
  2. Social and Emotional Learning (SEL) models — carry inherent biases and are unmeasurable
  3. Breach of Student Security – Data mining and student surveys generate personality data without consent
  4. Erosion of Teacher Autonomy – Student SEL data, generated by algorithms, creates more record-keeping and more marking, and cuts into classroom time.

The best evidence-based assessment research, applied in deconstructing the Ontario Assessment initiative, raises red flags.  Bad student assessment practices, as Wiliam and Christodoulou show, can lead to serious workload problems for classroom teachers. No education jurisdiction that lived up to the motto “Learning Province” would plow ahead when the light turns to amber.

A summary of the researchED Ontario presentation delivered April 14, 2018, at the Toronto Airport Westin Hotel. 

Where is the new Ontario student assessment initiative really heading? Is it a thinly-disguised attempt to create a counterweight to current large-scale student achievement assessments? Is it feasible to proceed with SEL assessment when leading researchers question its legitimacy and validity? Are we running the risk of opening the door to the wholesale mining of student personal information without consent and for questionable purposes? 

Read Full Post »

Millions of Facebook users were profiled by Cambridge Analytica without their knowledge, and that public disclosure has heightened everyone’s awareness of not only the trend to “personality profiling,” but the potential for massive invasion of privacy. These controversial actions have exposed the scope of Big Data and the wider aspirations of the data analytics industry to probe into the “hidden depths of people.” It has also, as U.K. expert Ben Williamson has reminded us, tipped us off about the growing trend toward personality measurement in K-12 and post-secondary education.

Williamson’s 2017 book, Big Data in Education, sounded the alert that the collection and analysis of more personal information from schoolchildren will be a defining feature of education in coming years. And just as the Facebook debacle raises public concerns about the use of personal data, a new international test of 10- and 15-year-olds is to be introduced by the Organization for Economic Cooperation and Development (OECD) – a powerful influence on national education policies at a global scale.  Almost without being detected, it is also emerging as a key component of the current Ontario Student “Well-Being” Assessment, initially piloted from 2014 to 2016 by Ontario People for Education as the core objective of its Measuring What Matters project.

Most data collected about students since the 1990s has come from conventional international, national and provincial examinations of knowledge and cognitive skills. Preparing students for success in the 21st century workplace has been a major driver of most initiatives in testing and accountability.  International test results such as OECD’s Program of International Student Assessment (PISA) have also become surrogate measures of the future economic potential of nations, feeding a global education race among national education systems.

The advent of Big Data is gradually transforming the nature of student assessment. While the initial phase was focused on stimulating competitive instincts and striving for excellence, more recent initiatives are seeking to “broaden the focus of student assessment” to include what is termed “social and emotional learning (SEL).” Much of the motivation is to secure some economic advantage, but that is now being more broadly defined to help mould students committed to more than individual competitiveness.  With the capacity to collect more “intimate” data about social and emotional skills to measure personality, education policymakers are devising curriculum and assessment programmes to improve personality scores. Despite the Cambridge Analytica controversy, personality data is well on the way to being used in education to achieve a variety of competing political objectives.

The ‘Big Five’ of Personality Profiling

The science of the psychographic profiling employed by Cambridge Analytica is hotly contested. It is, however, based on psychological methods that have a long history for measuring and categorizing people by personality. At its core is a psychological model called the “five factor model” of personality – or the “Big Five.” These include “openness”, “conscientiousness”, “extroversion”, “agreeableness” and “neuroticism” (OCEAN). Personality theorists believe these categories are suitable for classifying the full range of human personalities. Psychologists have invented instruments such as the so-called ‘Big Five Inventory’  to capture OCEAN data for personality modelling.
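To make the mechanics concrete, here is a minimal, hypothetical sketch of how a Big Five (OCEAN) inventory might be scored. The item wordings, trait keys, and reverse-scoring flags below are illustrative inventions for demonstration only — they are not drawn from the actual ‘Big Five Inventory’ or any published instrument.

```python
# Hypothetical sketch of scoring a Big Five (OCEAN) personality inventory.
# Items, trait assignments, and reverse-scoring flags are invented examples.
from statistics import mean

# Each item maps to (trait, reverse_scored). Responses use a 1-5 Likert scale.
ITEMS = {
    "I am full of new ideas":           ("openness", False),
    "I do a thorough job":              ("conscientiousness", False),
    "I am talkative":                   ("extroversion", False),
    "I am considerate of most people":  ("agreeableness", False),
    "I am relaxed, handle stress well": ("neuroticism", True),
}

def score_inventory(responses):
    """Average 1-5 Likert responses per trait, reversing flagged items."""
    by_trait = {}
    for item, answer in responses.items():
        trait, reverse = ITEMS[item]
        value = 6 - answer if reverse else answer  # reverse-score: 1 <-> 5
        by_trait.setdefault(trait, []).append(value)
    return {trait: mean(values) for trait, values in by_trait.items()}

profile = score_inventory({
    "I am full of new ideas": 4,
    "I do a thorough job": 5,
    "I am talkative": 2,
    "I am considerate of most people": 4,
    "I am relaxed, handle stress well": 5,  # reversed, so low neuroticism
})
print(profile)
```

The point of the sketch is how little machinery is needed: a handful of self-report answers, keyed to the five categories, yields a numeric personality profile of the kind that can then be aggregated, compared, and mined at scale.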

Advent of Stealth Assessment

The upcoming 2018 OECD PISA test will include, for the first time, a battery of questions aimed at assessing “global competencies” with a distinct SEL orientation. In 2019, the OECD plans to launch its international Study of Social and Emotional Learning.  Designed as a computer-based self-completion questionnaire, at its core the test is a modified version of the Big Five Inventory. The OECD version maps exactly onto the five-factor personality categories, with “emotional stability” substituted in place of “neuroticism.” When implemented, the social and emotional skills test will assess students against each of the Big Five categories.

The OECD Education Skills experts, working in collaboration with Pearson International, firmly believe that social and emotional skills are important predictors of educational progress and future workplace performance. Large-scale personality data is clearly seen by the OECD to be predictive of a country’s potential social and economic progress. Although the OECD and the Ontario Student Well-Being advocates both claim that it is strictly a test of social and emotional skills, Williamson claims such projects employ the same family of methods used in the Cambridge Analytica personality quiz. Upon closer examination, the same psychological assumptions and personality assessment methods underpin most of the latest education ventures.

The OECD is already a powerful influence on the moulding of national education policies. Its PISA testing has reshaped school curricula, assessments and whole systems in the global education race.  It is increasingly likely that its emphasis on personality testing will, once again, reshape education policy and school practices. Just as PISA has influenced a global market in products to support the core skills of literacy, numeracy and science tested by the assessment, the same is now occurring around SEL and personality development.  Canada’s provincial and territorial ministers of education, working under the auspices of the Council of Ministers of Education, Canada (CMEC) have not only endorsed the OECD’s  proposed “global competencies,” but proposed a variation of their own to guide assessment policy.

The Ontario Student Assessment initiative, announced September 6, 2017, deserves closer scrutiny through the lens of datafication and personality profiling. Its overarching goal bears repeating: “Update provincial assessment and reporting practices, including EQAO, to make sure they are culturally relevant, measure a wider range of learning, and better reflect student well-being and equity.”  Founder of People for Education Annie Kidder hailed the plan for “embedding” the “transferable skills” and positioning Ontario to take “a leading role in the global movement toward broader goals for education and broader measures of success in our schools.”

Critics of large-scale student assessments are quick to identify the underlying influence of “globalization” and the oft-stated goal  of preparing students for the highly competitive “21st century workplace.”  It can be harder to spot currents moving in the opposite direction and heavily influenced by what Kathryn Ecclestone and Denis Hayes aptly termed the “therapeutic education ethos.” Ten years ago, they flagged the rise of  a “therapeutic education” movement exemplified by classroom activities and programs, often branded as promoting ‘mindfulness,’ which pave the way for “coaching appropriate emotions” and transform education into a disguised form of “social engineering” aimed at producing “emotionally literate citizens” who are notably “happy” and experience “emotional well-being.”

Preparing students to be highly competitive human beings or to be creative and cooperative individuals risks re-framing public education in terms of personality modification, driven by ideological motivations, rather than the pursuit of meaningful knowledge and understanding. It treats children as ‘guinea pigs’ engaged in either market-competition preparation or social engineering, and may well stand in the way of classroom teachers pursuing their own evidence-based, knowledge-centred curriculum aims.

The appropriation and misuse of personality data by Facebook and Cambridge Analytica led to a significant worldwide public backlash. In education, however, tests and technologies to measure student personality, according to Williamson, are passing unchallenged. Yet it is equally controversial to capture and mine students’ personality data with the goal of shaping students to “fit into” the evolving global marketplace.  Stealth assessment has arrived, and being forewarned is forearmed.

Why is education embracing data mining and personality profiling for schoolchildren? What are the connections between Facebook data mining and recent social-and-emotional learning assessment initiatives?  Should students and parents be advised, in advance, when student data is being mined and mapped against personality types?  Why have Canadian assessment projects like the Ontario Measuring What Matters – Student Well-Being initiative escaped close scrutiny?  Should we be more vigilant in tracking and monitoring the use and abuse of Big Data in education? 

Read Full Post »
