The Higher Education Quality Council of Ontario (HEQCO) recently proclaimed that one-quarter of graduating students score below adequate on measures of literacy and numeracy. This was quickly echoed in a Globe and Mail headline: One in four Ontario postsecondary students lacks basic literacy, numeracy skills, studies say. HEQCO came to this conclusion based on the results of its pilot study using the OECD’s Education and Skills Online (ESO) assessment.
One of the main problems with the conclusion and the overall aim of the project—to ensure postsecondary institutions are producing productive workers—is that the determination of adequacy and minimal proficiency is based on a lie.
In their summary, Harvey Weingarten, the president and CEO, and his team write the following:
HEQCO has identified Level 3 as the minimum required proficiency level for Ontario’s higher education graduates.
The project’s aims are also hooked into the very dangerous conceptualization of social worth and the move to categorize citizens based on their productivity and contribution to the economy. Weingarten and associates state (Measuring Essential Skills of Postsecondary Students: Final Report of the Essential Adult Skills Initiative, 2018, p. 6):
Anything less [than Level 3] carries too much risk of drag on both individuals’ and the economy’s performance.
However, ESO, the commercial spin-off of a test used in a 30-year international large-scale assessment project managed by the OECD, cannot be used to make such proclamations.
False claims were perpetuated within the context of the international testing project during its first two rounds of testing in the 1990s and 2000s when the OECD was partnered with Statistics Canada. When the partnership ended and the OECD took over the project, the manager, William Thorn, made an unequivocal statement about the Level 3 designation of adequacy. (I wrote about this here.)
We’re making no claim that Level 3 is essential to managing in modern life. That’s manifestly false…That kind of description is a supplementary interpretation which has been put on these levels, which I don’t think is justified. And I think that’s a view that is shared by many other people as well (Centre for Literacy, 2013).
One of the “many others” who share this view is the test designer himself, Irwin Kirsch, who has also stated:
[T]hese data do not reveal the types of literacy demands that are associated with particular contexts in this pluralistic society. That is, they do not enable us to say what specific level of prose, document, or quantitative skill is required to obtain, hold, or advance in a particular occupation, to manage a household, or to obtain legal or community services, for example (NCES technical report, 2000, p. 9).
In other words, a Level 3 result, or any result, cannot be used to predict actual performance in people’s lives. There simply isn’t enough empirical evidence. Attempts to make the connection—to show that literacy on its own can open up opportunities regardless of broader social circumstances and economic conditions—keep revealing a relationship that is highly complex and contingent.
Our literacy abilities are a consequence of our engagement in literacy-rich and demanding environments, like school. Most people are able to meet the demands they encounter. As a consequence, their tested skills can increase over time. Conversely, when working or living in impoverished literacy environments, people’s skills can deteriorate.
Compounding the questionable conclusions is a perplexing research design, in which first-year students were assessed and compared to a different group of graduating students (a total of 4,630 students from 20 institutions participated in the ESO pilot). The sample was not representative and not randomized. Although the report writers emphasize that this was a pilot study, there are no reminders that their conclusions are not generalizable. In addition, the testing sessions were not monitored, and students could log on or off at will during a test window lasting a few weeks.
The analysis also ignores a puzzling drop in literacy scores between the first-year and graduating college students. Such a drop is a red flag that may indicate an issue with the study design, or even point to a limitation of the test tool in certain circumstances (over half of the college students were in two-year diploma programs).
Also not considered are the consistently higher scores on ESO compared to PIAAC, even though the two tests are intended to be directly comparable.
HEQCO, though, with funding for the project from both the provincial and federal governments, deemed the pilot such a success that it plans to work on implementing regular testing of “transferable skills” in Ontario’s colleges and universities.
They argue that postsecondary institutions should explicitly teach these transferable skills, perhaps like the ones used in testing, in addition to teaching disciplinary knowledge. This line of thinking fails to grasp that the test is picking up on those very disciplinary abilities. Explicitly teaching the generic skills used in testing may increase test scores in the short term, but it likely won’t help with disciplinary learning. It could also have a perverse effect on postsecondary learning as class time is diverted to generic abilities. And for what? A generic approach doesn’t imply transfer of anything except the generic abilities of testing. It wouldn’t help students navigate the myriad literacy environments—some highly demanding and others requiring little from students’ literacy repertoires—that they enter once they graduate.
Policymakers within the Ministry of Training, Colleges and Universities (MTCU), which funds HEQCO, are chasing a mirage if they think that Level 3 predicts actual outcomes and job performance. Even if the “manifestly false” claim no longer coordinated ministry thinking, policy folks still have to contend with another misunderstanding—that the skills needed to achieve this and any level within ESO/PIAAC are readily learnable and transferable.
Literacy and Basic Skills (LBS) has been encumbered by this thinking, and the policy arrangements that followed from it, since 2012. It offers compelling insights into the way that gross misunderstandings of literacy development, and policy designed around a false claim, can lead to systemic inequities and a series of contradictions, confusions and compromises.
This post was first published in Policy Problems.