PISA and quality education

(First of two parts)

Almost everyone agrees that low quality is the biggest problem of Philippine education. The quality indicator that has captured the attention of education analysts, observers and advocates is the country’s performance in the Programme for International Student Assessment or PISA, in which the Philippines first participated in 2018 and again in the 2022 round. Administered by the Organisation for Economic Co-operation and Development (OECD), the PISA evokes significant public reaction because of its international character and the ranking of country scores.

But not many are aware of how the PISA is done and how the OECD itself interprets the results.

The PISA tests the proficiency of a representative sample of a country’s 15-year-old learners in reading, science, and mathematics. The scores are set in relation to the results observed across all participants, scaled to fit approximately normal distributions (means around 500 score points, and standard deviations around 100 score points). The PISA scales are then divided into proficiency levels arranged according to the degree of difficulty of the test items.

Mathematics and reading have eight levels, with level 1c corresponding to the simplest items, and the remaining seven levels (1b, 1a, 2, 3, 4, 5, and 6) corresponding to increasingly difficult items. For science, there are seven proficiency levels, namely, 1b, 1a, 2, 3, 4, 5, and 6.

PISA describes the tasks that students can do at the proficiency level corresponding to their score. PISA has set Level 2 as its baseline, which, for reading and mathematics, serves as a monitoring tool for the minimum proficiency level under the United Nations Sustainable Development Goals (SDG) targets.

In the 2022 PISA, the Philippines tallied an average of 355 score points in mathematics (the lower cut-off for level 2 is 420 score points), 347 in reading (the lower cut-off for level 2 is 407 score points), and 356 in science (the lower cut-off for level 2 is 410 score points). Our aggregate scores in all three subjects rank us 77th among 80 participating countries and economies.

The Philippine scores correspond to proficiency level 1b for mathematics (three score points shy of the cut-off to reach level 1a), level 1a for reading, and level 1a for science. Contrary to the perception that our PISA performance means our 15-year-old learners cannot read or count, what the Philippine average scores indicate is that our learners are only able to demonstrate low levels of knowledge and skills in these subjects.

Specifically, level 1b in mathematics indicates that the student “can respond to questions involving easy to understand contexts where all information needed is clearly given in a simple representation (i.e., tabular or graphic) and, as necessary, recognize when some information is extraneous and can be ignored with respect to the specific question being asked. They are able to perform simple calculations with whole numbers, which follow from clearly prescribed instructions, defined in short, syntactically simple text.” In contrast, the baseline minimum norm of level 2 indicates that “students can recognize situations where they need to design simple strategies to solve problems, including running straightforward simulations involving one variable as part of their solution strategy. They can extract relevant information from one or more sources that use slightly more complex modes of representation, such as two-way tables, charts, or two-dimensional representations of three-dimensional objects. Students at this level demonstrate a basic understanding of functional relationships and can solve problems involving simple ratios. They are capable of making literal interpretations of results.”

For reading, level 1a indicates that the learner “can understand the literal meaning of sentences or short passages. Readers at this level can also recognize the main theme or the author’s purpose in a piece of text about a familiar topic, and make a simple connection between several adjacent pieces of information, or between the given information and their own prior knowledge. They can select a relevant page from a small set based on simple prompts, and locate one or more independent pieces of information within short texts. Level 1a readers can reflect on the overall purpose and on the relative importance of information (e.g., the main idea vs. non-essential detail) in simple texts containing explicit cues. Most tasks at this level contain explicit cues regarding what needs to be done, how to do it, and where in the text(s) readers should focus their attention.”

In contrast, the baseline minimum norm of level 2 indicates that the learner “can identify the main idea in a piece of text of moderate length. They can understand relationships or construe meaning within a limited part of the text when the information is not prominent by producing basic inferences, and/or when the text(s) include some distracting information. They can select and access a page in a set based on explicit though sometimes complex prompts, and locate one or more pieces of information based on multiple, partly implicit criteria. Readers at Level 2 can, when explicitly cued, reflect on the overall purpose, or on the purpose of specific details, in texts of moderate length. They can reflect on simple visual or typographical features. They can compare claims and evaluate the reasons supporting them based on short, explicit statements. Tasks at Level 2 may involve comparisons or contrasts based on a single feature in the text. Typical reflective tasks at this level require readers to make a comparison or several connections between the text and outside knowledge by drawing on personal experience and attitudes.”

For science, level 1a indicates that a learner is “able to use basic or everyday content and procedural knowledge to recognize or identify explanations of simple scientific phenomena. With support, they can undertake structured scientific enquiries with no more than two variables. They are able to identify simple causal or correlational relationships and interpret graphical and visual data that require a low level of cognitive demand. Level 1a students can select the best scientific explanation for given data in familiar personal, local, and global contexts.”

At level 2, “students are able to draw on everyday content knowledge and basic procedural knowledge to identify an appropriate scientific explanation, interpret data and identify the question being addressed in a simple experimental design. They can use basic or everyday scientific knowledge to identify a valid conclusion from a simple data set. Level 2 students demonstrate basic epistemic knowledge by being able to identify questions that can be investigated scientifically.”

Given the country’s average point scores falling below the baseline minimum proficiency of level 2, and its low ranking among the participating countries, the apparent consensus is that we have an education crisis.

However, no matter how we describe the intensity and gravity of the problem in education, the discourse must ask what explains the problem, which in turn can lead us to the appropriate solutions to improve education outcomes quantitatively and qualitatively. The OECD report itself provides an answer: PISA performance is strongly correlated with a country’s economic status. As expounded in the OECD report on the 2022 PISA results, “the economic and social conditions of different countries/economies, which are often beyond the control of education policy makers and educators, can influence student performance.”

It adds that “some 62% of the difference in countries’/economies’ mean scores is related to per capita GDP.” The country’s GDP per capita in 2021, in international dollars converted using purchasing power parities (PPPs), is the fifth lowest among the 80 participating countries and economies. Of the participating ASEAN countries, our GDP per capita is higher only than Cambodia’s, and Cambodia also scored lower than the Philippines in all subjects. Vietnam’s per capita GDP is 31% higher than ours, Indonesia’s is 46% higher, Thailand’s is 111% higher, and Malaysia’s is 225% higher.

While these countries scored higher than the Philippines in the PISA, the national averages of both Indonesia and Thailand also did not meet the level 2 baseline minimum proficiency, while Malaysia reached it only for science. Only Vietnam surpassed level 2 in all subjects.

Do we want to compare our PISA performance with that of Singapore, which topped the country averages at 575 score points for mathematics (level 4), 561 for science (level 4), and 543 for reading (level 3)? Well, Singapore’s per capita GDP is higher than ours by 1,200%, and it is in fact the highest GDP per capita among all participating countries.

Thus, it will not be misplaced to say that our ability to address the challenge of education quality will depend a great deal on the ability of both the public and private sectors, specifically the President and his Cabinet, Congress and politicians, the economic managers and technocrats, and the businessmen and investors to dramatically improve our economic performance.

That said, variables other than economic growth also explain better-than-average education outcomes. Improvements in education quality must complement economic performance. Indeed, there are countries and economies that perform better than their economic status would otherwise predict. Among the participating ASEAN countries, Vietnam and Singapore are examples. As emphasized in the OECD report, “some 31% of differences in student performance are due to differences in countries’ education systems — mainly in how they are organized, financed, and use their resources.”

It goes without saying that the DepEd likewise bears great responsibility for improving the quality of Philippine education.

(To be continued.)

 

Nepomuceno Malaluan is a founding trustee of Action for Economic Reforms and a former DepEd undersecretary.