
Posts Tagged ‘PISA Results’

The Homework Debate never seems to go away. Popular books and articles, inspired initially by American education writer Alfie Kohn and his Canadian disciples, continue to beat the drum for easing the homework burden on students or eliminating homework altogether before the secondary school level. That “No Homework” movement made significant inroads in the United States and Canada during the 2000s. The Organization for Economic Cooperation and Development (OECD), responsible for the Programme for International Student Assessment (PISA), confirmed that the amount of time North American students spent on homework had declined as of the 2014 assessment year.


A critical question needs to be asked: Have the “No Homework” movement and the apparent push-back against homework had an adverse effect on student achievement? That is difficult to answer because, despite the critical importance of the issue and the long history of homework research, few North American researchers have been inclined to study the role homework plays in enhancing student achievement, even in mathematics.

One little-known researcher, Lake B. Yeworiew, an Ethiopian scholar recently arrived in Canada and based at the University of Calgary, saw the hole in the research and tackled the whole question. His focus was on assessing the relationship between homework and Grade 8 Mathematics achievement, comparing Canadian students with the top-performing students in the world. While attending the AERA 2019 Congress (April 5-9) in Toronto, I got a sneak peek at his findings. Although his research study attracted little attention, it will be of considerable interest to all of those committed to maintaining and improving student performance standards.


His University of Calgary study, co-authored with Man-Wai Chu and Yue Xu, laid out the essential facts: the average performance of Canadian students in Mathematics (PISA) has declined since 2006 (OECD, 2007, 2010, 2014, 2016), and students from three top-performing Asian countries, Singapore, Macao-China and Japan, continue to outperform our 15-year-old students by a significant margin. Furthermore, the OECD reports that students in Asian countries (Singapore, Japan, Macao-China and Hong Kong-China) spend more time doing homework and score much higher; it is estimated that they score 17 points or more per extra hour of homework.

Recent North American research seems more alert to the need to study the relationship between homework and academic achievement, particularly in mathematics. A literature review conducted by Yeworiew, Chu and Xu demonstrates that, while the findings cut in both directions, the weight of research favours homework. In fact, the Council of Ministers of Education, Canada (CMEC, 2014) has come down in favour of homework. Based upon the Pan-Canadian Assessment Program (PCAP) national test surveys, CMEC confirms that the math achievement of students who do not do homework is significantly lower than that of students doing regular homework.

Yeworiew and his research team provide further confirmation of this 2014 CMEC assessment. Utilizing the 2015 TIMSS study in Canada, involving 8,757 students and 276 schools in four provinces (Ontario, Quebec, Manitoba and Newfoundland and Labrador), the authors demonstrate the clear value of regular homework in modest amounts.

The research findings are effectively presented in a series of graphs mapping the study results, reprinted here directly from their AERA 2019 Toronto presentation:


The relationship between homework and achievement is becoming less of a mystery. Based upon the performance of Grade 8 students in the 2015 TIMSS study, short but frequent homework assignments contribute to improved student learning and achievement in mathematics. Frequent homework assignments, up to four times a week, have a positive effect on math achievement, but less so when the homework is of longer duration. No discernible differences were detected between girls and boys at the Grade 8 level in Canada.

Why do Canadian researchers produce so few studies like the University of Calgary project attempting to assess the impact of homework on achievement?  To what extent is it because Canadian homework studies tend to focus on psycho-social aspects such as the impact of homework on student attitudes and the opinions of parents?

Are we asking the right questions? “How much is enough?” is surely a sounder line of inquiry than “How do you feel when overburdened with homework?” What is really accomplished by asking “Does homework add to your anxieties?” Should we be more conscious of the inherent biases in such research questions?



Read Full Post »

“Canadians can be proud of our showing in the 2015 Programme for International Student Assessment (PISA) report,” declared Science consultant Bonnie Schmidt and former Council of Ministers of Education (CMEC) director Andrew Parkin in their first-off-the-mark December 6, 2016 response to the results. “We are,” they added, “one of only a handful of countries that places in the top tier of the Organization for Economic Cooperation and Development (OECD) in each of the three subjects tested: science, reading and math.”

“Canada” and “Canadian students,” we were told, were once again riding high in the once-every-three-years international test sweepstakes. If that effusively positive response had a familiar ring, it was because it followed the official line advanced by a markedly similar CMEC media release, issued a few hours before the commentary.

Since our students, all students in each of our ten provincial school systems, were “excelling,” it was time for a little national back-slapping. There is one problem with that blanket analysis: it serves to maintain the status quo, engender complacency, obscure the critical Mathematics scores, and disguise the lopsided nature of student performance from region to region.

Hold on, not so fast, CMEC — the devil is in the details, and those details are more clearly portrayed in the OECD’s own “Country Profile” for Canada. Yes, 15-year-olds in three Canadian provinces (Alberta, British Columbia, and Quebec) achieved some excellent results, but overall Mathematics scores were down, and students in over half of our provinces trailed off into mediocrity in terms of performance. Our real success was not in performance, but rather in reducing the achievement gap adversely affecting disadvantaged students.

Over half a million 15-year-olds in 72 jurisdictions all over the world completed the PISA tests, and Schmidt and Parkin were not alone in making sweeping pronouncements about why some countries are up and others down in the global rankings.

Talking in aggregate terms about the PISA performance of 20,000 Canadian students in ten different provinces can be, and is, misleading, when the performance results in mathematics continue to lag, Ontario students continue to underperform, and students in two provinces, Manitoba and Saskatchewan, struggle in science, reading, and mathematics.  Explaining all that away is what breeds complacency in the school system.

My own PISA 2015 forecast was way off-base — and it taught me a lesson. After the TIMSS 2015 Mathematics results released in November 2016, an East Asian sweep, led by Singapore and Korea, seemed like a safe bet. How Finland performs also attracts far less attention than it did in its halcyon days back in 2003 and 2006. The OECD’s significant pivot away from excellence toward equity caught me napping, and I completely missed the significance of moving (2012 to 2015) from pencil-and-paper to computer-based tests.

Some solace can be found in the erroneous forecasts of others. The recent Alberta Teachers’ Association (ATA) “Brace Yourself” memo, with its critique of standardized testing, seemed to forecast a calamitous drop in Alberta student performance levels. It only happened in Mathematics.

Advocates of the ‘Well-Being’ curriculum and broader assessment measures, championed by Toronto’s People for Education, will likely be temporarily thrown off-stride by the OECD’s new-found commitment to assessing equity in education. It will be harder now to paint PISA as evil and to discredit its results as resting on too narrow a range of skills in reading, math and science.

The OECD’s “Country Profile” of Canada is worth studying carefully because it aggregates data from 2003 to 2015, clarifies the trends, and shows how Canadian students continue to struggle in mathematics far more than in reading and science.

Canadian students may have finished 12th in Mathematics with a 516 aggregate score, but the trend line continues to be in decline, down from 532 in 2003. Digging deeper, we see that students in only two provinces, Quebec (544) and BC (522), actually exceeded the national mean score. Canada’s former leader in Mathematics performance, Alberta, continued its downward spiral from the lofty heights of 549 (2003) to 511 (2015).

Since Ontario students’ provincial mathematics scores are declining, experts will be poring over the latest PISA results to see how bad it is in relation to the world’s top-performing systems. No surprises here: Ontario students scored 509, finishing 4th in Canada and down from 530 on PISA 2003. Excellence will require a significant change in direction.

The biggest discovery in post-2015 PISA analysis was the positive link between explicit instruction and higher achievement in the 2015 core assessment subject, science. The most important factor linked with high performance remains SES (socio-economic status), but teacher-guided instruction ranked close behind, and students taught with minimal direction, in inquiry or project-based classes, simply performed less well on the global test.

The results of the 15-year-olds are largely determined over 10 years of schooling, and are not necessarily the direct consequence of the latest curriculum fad such as “discovery math.”

It’s better to look deeper into what this cohort of students were learning when they first entered the school system, in the mid-1990s. In the case of Canadian students, for example, student-centred learning was at its height, and the country was just awakening to the value of testing to determine what students were actually learning in class.

Where the student results are outstanding, as in Singapore and Estonia, that is not solely attributable to the excellence of teaching or the rigour of the math and science curriculum.

We know from the “tutoring explosion” in Canada’s major cities that the prevalence of private tuition classes after school is a contributing factor, and may explain the current advantage still enjoyed in mathematics by Pacific Rim students.

Children of Chinese heritage in Australia actually outperformed students in Shanghai on the 2012 PISA test, and we need to explore whether that may be true for their counterparts in Greater Vancouver. The so-called “Shanghai Effect” may be attributed as much to “tiger mothers” as it is to the quality of classroom instruction.

Whether Canada and Canadians continue to exhibit high PISA self-esteem or have simply plateaued does not matter as much as what we glean over the next few years from studying best international practice in teaching, learning, and assessment.

Surveying PISA student results, this much is clear: standing still is not an option in view of the profound changes that are taking place in life, work, and society.


Read Full Post »