Programme for International Student Assessment

From Wikipedia, the free encyclopedia
"PISA" redirects here. For other uses, see Pisa (disambiguation).
Abbreviation: PISA
Formation: 1997
Purpose: Comparison of educational attainment across the world
Headquarters: OECD Headquarters
Region served: World
Membership: 59 government education departments
Head of the Early Childhood and Schools Division: Michael Davidson
Main organ: PISA Governing Body (Chair: Lorna Bertrand, England)
Parent organization: OECD
Website: PISA

The Programme for International Student Assessment (PISA) is a worldwide study by the Organisation for Economic Co-operation and Development (OECD) of 15-year-old school pupils' scholastic performance in mathematics, science, and reading, in member and non-member nations. It was first administered in 2000 and has been repeated every three years since. Its aim is to provide comparable data with a view to enabling countries to improve their education policies and outcomes. It measures problem solving and cognition in daily life.[1]

The 2012 version of the test involved 34 OECD countries and 31 partner countries, with a total of 510,000 participating students.[2] The results of the 2015 test were published on 6 December 2016.[3]

The Trends in International Mathematics and Science Study (TIMSS) and the Progress in International Reading Literacy Study (PIRLS) by the International Association for the Evaluation of Educational Achievement are similar studies.

Influence and impact

PISA and similar international standardised assessments of educational attainment are increasingly used in the process of education policymaking at both national and international levels.[4]

PISA was conceived to place the information provided by national monitoring of education system performance in a wider context, through regular assessments within a common, internationally agreed framework. By investigating relationships between student learning and other factors, such assessments can "offer insights into sources of variation in performances within and between countries".[5]

Until the 1990s, few European countries used national tests. In the 1990s, ten countries/regions introduced standardised assessment, and since the early 2000s, ten more have followed suit. By 2009, only five education systems had no national student assessments.[4]

The impact of these international standardised assessments in the field of educational policy has been significant, in terms of the creation of new knowledge, changes in assessment policy, and external influence over national educational policy more broadly.

Creation of new knowledge

Data from international standardised assessments can be useful in research on causal factors within or across education systems.[4] Mons notes that the databases generated by large-scale international assessments have made it possible to carry out, on an unprecedented scale, inventories and comparisons of education systems in more than 40 countries, on themes ranging from the conditions for learning in mathematics and reading to institutional autonomy and admissions policies.[6] They allow typologies to be developed that can be used for comparative statistical analyses of education performance indicators, thereby identifying the consequences of different policy choices. They have generated new knowledge about education: PISA findings have challenged deeply embedded educational practices, such as the early tracking of students into vocational or academic pathways.[7]

Barroso and de Carvalho find that PISA provides a common reference connecting academic research in education with the political realm of public policy, operating as a mediator between the two strands of knowledge.[8] However, although the key findings from comparative assessments are widely shared in the research community,[4] the knowledge they create does not necessarily fit with government reform agendas; this leads to some inappropriate uses of assessment data.

Changes in national assessment policy

Emerging research suggests that international standardised assessments are impacting upon national assessment policy and practice. PISA is being integrated in national policies and practices on assessment, evaluation, curriculum standards and performance targets; its assessment frameworks and instruments are being used as best-practice models for improving national assessments; many countries have explicitly incorporated and emphasise PISA-like competencies in revised national standards and curricula; others use PISA data to complement national data and validate national results against an international benchmark.[7]

External influence over national educational policy

More important than its influence on countries' student assessment policies is the range of ways in which PISA is influencing countries' education policy choices.

Policy-makers in most participating countries see PISA as an important indicator of system performance; PISA reports can define policy problems and set the agenda for national policy debate; policymakers seem to accept PISA as a valid and reliable instrument for internationally benchmarking system performance and changes over time; most countries - irrespective of whether they performed above, at, or below the average PISA score - have begun policy reforms in response to PISA reports.[7]

Against this, it should be noted that the impact on national education systems varies markedly. For example, in Germany the results of the first PISA assessment caused the so-called 'PISA shock': a questioning of previously accepted educational policies. In a country marked by jealously guarded regional policy differences, it ultimately led to an agreement by all Länder to introduce common national standards, and even an institutionalised structure to ensure that they were observed.[9] By contrast, in Hungary, which shared similar conditions with Germany, PISA results have not led to significant changes in educational policy.[10]

Because many countries have set national performance targets based on their relative rank or absolute PISA score, PISA assessments have increased the influence of their (non-elected) commissioning body, the OECD, as an international education monitor and policy actor, which implies an important degree of ‘policy transfer’ from the international to the national level; PISA in particular is having “an influential normative effect on the direction of national education policies”.[7] Thus, it is argued that the use of international standardised assessments has led to a shift towards international, external accountability for national system performance; Rey contends that PISA surveys, portrayed as objective, third-party diagnoses of education systems, actually serve to promote specific orientations on educational issues.[4]

National policy actors refer to high-performing PISA countries to "help legitimise and justify their intended reform agenda within contested national policy debates".[11] PISA data are "used to fuel long-standing debates around pre-existing conflicts or rivalries between different policy options, such as in the French Community of Belgium".[12] In such instances, PISA assessment data are used selectively: in public discourse, governments often use only superficial features of PISA surveys, such as country rankings, and not the more detailed analyses. Rey (2010:145, citing Greger, 2008) notes that the real results of PISA assessments are often ignored, as policymakers selectively refer to data in order to legitimise policies introduced for other reasons.[13]

In addition, PISA's international comparisons can be used to justify reforms with which the data themselves have no connection. In Portugal, for example, PISA data were used to justify new arrangements for teacher assessment (based on inferences that the assessments and data themselves did not support); they also fed the government's discourse about pupils repeating a year (a practice which, according to research, fails to improve student results).[14] In Finland, the country's PISA results (deemed excellent in other countries) were used by ministers to promote new policies for 'gifted' students.[15] Such uses and interpretations often assume causal relationships that cannot legitimately be based upon PISA data and that would normally require fuller investigation through qualitative in-depth studies and longitudinal surveys based on mixed quantitative and qualitative methods,[16] which politicians are often reluctant to fund.

Recent decades have witnessed an expansion in the uses to which PISA and similar assessments are put, from assessing students' learning to connecting "the educational realm (their traditional remit) with the political realm".[17] This raises the question of whether PISA data are sufficiently robust to bear the weight of the major policy decisions being based upon them, for, according to Breakspear, PISA data have "come to increasingly shape, define and evaluate the key goals of the national / federal education system".[7] This implies that those who set the PISA tests – e.g. in choosing what content is and is not assessed – hold considerable power to set the terms of the education debate and to orient educational reform in many countries around the globe.[7]

Framework

PISA stands in a tradition of international school studies, undertaken since the late 1950s by the International Association for the Evaluation of Educational Achievement (IEA). Much of PISA's methodology follows the example of the Trends in International Mathematics and Science Study (TIMSS, started in 1995), which in turn was much influenced by the U.S. National Assessment of Educational Progress (NAEP). The reading component of PISA is inspired by the IEA's Progress in International Reading Literacy Study (PIRLS).

PISA aims to test literacy in three competence fields (reading, mathematics, and science), each reported on a 1000-point scale.[18]

The PISA mathematics literacy test asks students to apply their mathematical knowledge to solve problems set in real-world contexts. To solve the problems students must activate a number of mathematical competencies as well as a broad range of mathematical content knowledge. TIMSS, on the other hand, measures more traditional classroom content such as an understanding of fractions and decimals and the relationship between them (curriculum attainment). PISA claims to measure education's application to real-life problems and lifelong learning (workforce knowledge).

In the reading test, "OECD/PISA does not measure the extent to which 15-year-old students are fluent readers or how competent they are at word recognition tasks or spelling." Instead, students should be able to "construct, extend and reflect on the meaning of what they have read across a wide range of continuous and non-continuous texts."[19]

Implementation

PISA is sponsored, governed, and coordinated by the OECD, but paid for by participating countries.

Method of testing

Sampling

The students tested by PISA are aged between 15 years and 3 months and 16 years and 2 months at the beginning of the assessment period. The school year the pupils are in is not taken into consideration. Only students in school are tested, not home-schoolers. In PISA 2006, however, several countries also used a grade-based sample of students, which made it possible to study how age and school year interact.

To fulfill OECD requirements, each country must draw a sample of at least 5,000 students. In small countries like Iceland and Luxembourg, where there are fewer than 5,000 students per year, an entire age cohort is tested. Some countries used much larger samples than required to allow comparisons between regions.
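
The age window above amounts to a simple arithmetic rule. As a minimal sketch in Python (assuming eligibility is decided at whole-month resolution; the official sampling manuals define the window more precisely), it could be checked like this:

    from datetime import date
    from dateutil.relativedelta import relativedelta  # third-party: python-dateutil

    def pisa_age_eligible(birth_date: date, assessment_start: date) -> bool:
        """Return True if a student is between 15 years 3 months and
        16 years 2 months old at the beginning of the assessment period."""
        age = relativedelta(assessment_start, birth_date)
        age_in_months = age.years * 12 + age.months
        return 15 * 12 + 3 <= age_in_months <= 16 * 12 + 2

    # A student born 15 May 2000 is 15 years 10 months old on 1 April 2016:
    print(pisa_age_eligible(date(2000, 5, 15), date(2016, 4, 1)))  # True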

Test

PISA test documents on a school table (Neues Gymnasium, Oldenburg, Germany, 2006)

Each student takes a two-hour handwritten test. Part of the test is multiple-choice and part requires fuller answers. There are six and a half hours of assessment material, but each student is not tested on all of it. Following the cognitive test, participating students spend nearly one more hour answering a questionnaire on their background, including learning habits, motivation, and family. School directors fill in a questionnaire describing school demographics, funding, and so on. In 2012, for the first time in the history of large-scale testing and assessment, participants were offered a new type of problem: interactive (complex) problems requiring exploration of a novel virtual device.[20][21]
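
The rotation of material across students can be illustrated with a simplified sketch. PISA's actual booklet design is a balanced incomplete block design; the cyclic rotation below is only a stand-in, with 13 half-hour item clusters as an assumed figure, showing how roughly six and a half hours of material can be spread so that each student sits a two-hour subset:

    def make_booklets(clusters, per_booklet=4):
        """Cyclically assign item clusters to booklets: each booklet holds
        per_booklet consecutive clusters (wrapping around the list), so every
        cluster appears in the same number of booklets."""
        n = len(clusters)
        return [[clusters[(start + k) % n] for k in range(per_booklet)]
                for start in range(n)]

    # 13 half-hour clusters ~ 6.5 hours of material; 4 clusters ~ a 2-hour test
    clusters = ["C%d" % i for i in range(1, 14)]
    booklets = make_booklets(clusters)
    print(booklets[0])  # ['C1', 'C2', 'C3', 'C4']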

In selected countries, PISA has begun experimenting with computer-adaptive testing.

National add-ons

Countries are allowed to combine PISA with complementary national tests.

Germany does this in an extensive way: on the day following the international test, students take a national test called PISA-E (E for Ergänzung, 'complement'). Test items of PISA-E are closer to TIMSS than to PISA. While only about 5,000 German students participate in the international and the national test, another 45,000 take only the latter. This large sample is needed to allow analysis by federal state. Following a clash over the interpretation of the 2006 results, the OECD warned Germany that it might withdraw the right to use the "PISA" label for national tests.[22]

Data scaling

From the beginning, PISA has been designed with one particular method of data analysis in mind. Since students work on different test booklets, raw scores must be 'scaled' to allow meaningful comparisons. Scores are scaled so that the OECD average in each domain (mathematics, reading and science) is 500 and the standard deviation is 100.[23] This holds only for the initial PISA cycle in which the scale was introduced; subsequent cycles are linked to the previous ones through IRT scale-linking methods.[24]
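
As a minimal sketch of that linear transformation (assuming a simple unweighted pool of OECD students; the operational procedure applies country and sampling weights):

    import numpy as np

    def to_pisa_scale(thetas, target_mean=500.0, target_sd=100.0):
        """Linearly map proficiency estimates onto the PISA reporting scale,
        fixing the pooled mean at 500 and the standard deviation at 100."""
        thetas = np.asarray(thetas, dtype=float)
        z = (thetas - thetas.mean()) / thetas.std()
        return target_mean + target_sd * z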

These proficiency estimates are generated using a latent regression extension of the Rasch model, a model of item response theory (IRT), also known as the conditioning model or population model. The proficiency estimates are provided in the form of so-called plausible values, which allow unbiased estimates of differences between groups. The latent regression, together with the use of a Gaussian prior probability distribution of student competencies, allows estimation of the proficiency distributions of groups of participating students.[25] The scaling and conditioning procedures are described in nearly identical terms in the technical reports of PISA 2000, 2003 and 2006. NAEP and TIMSS use similar scaling methods.
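
The following sketch shows these two ingredients in stylised form: the Rasch item response function, and posterior draws of plausible values that combine the response likelihood with a Gaussian prior supplied by the latent regression. It illustrates the general technique only, not the operational PISA software; the grid, prior parameters and example responses are placeholder assumptions.

    import numpy as np

    def rasch_prob(theta, b):
        """Rasch model: probability of a correct answer, given proficiency
        theta and item difficulty b."""
        return 1.0 / (1.0 + np.exp(-(theta - b)))

    def draw_plausible_values(responses, difficulties, prior_mean, prior_sd,
                              n_pv=5, seed=0):
        """Draw plausible values from the posterior, proportional to the
        response likelihood times the Gaussian prior from the latent regression."""
        grid = np.linspace(-4.0, 4.0, 401)
        p = rasch_prob(grid[:, None], np.asarray(difficulties)[None, :])
        resp = np.asarray(responses)[None, :]
        like = np.prod(np.where(resp == 1, p, 1.0 - p), axis=1)
        post = like * np.exp(-0.5 * ((grid - prior_mean) / prior_sd) ** 2)
        post /= post.sum()
        return np.random.default_rng(seed).choice(grid, size=n_pv, p=post)

    # Five plausible values for a student who answered four items (1 = correct):
    pvs = draw_plausible_values([1, 0, 1, 1], [-0.5, 0.2, 0.0, 1.0],
                                prior_mean=0.3, prior_sd=0.8)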

Results

All PISA results are tabulated by country; recent PISA cycles have included separate provincial or regional results for some countries. Most public attention concentrates on just one outcome: the mean scores of countries and the rankings of countries against one another. In the official reports, however, country-by-country rankings are given not as simple league tables but as cross tables indicating for each pair of countries whether or not the mean score difference is statistically significant (i.e. unlikely to be due to random fluctuations in student sampling or in item functioning). In favorable cases, a difference of 9 points is sufficient to be considered significant.[citation needed]
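
To see why a gap as small as 9 points can already be significant, consider a two-sample z-test on country means; the standard errors of roughly 3 points used below are illustrative assumptions, not official figures:

    import math

    def significant_difference(mean_a, se_a, mean_b, se_b, z_crit=1.96):
        """Two-sided z-test for the difference between two country means,
        treating the two samples as independent."""
        se_diff = math.sqrt(se_a ** 2 + se_b ** 2)
        return abs(mean_a - mean_b) > z_crit * se_diff

    # With standard errors of ~3 points each, the 5% threshold is
    # 1.96 * sqrt(3**2 + 3**2) ~ 8.3 points, so a 9-point gap qualifies.
    print(significant_difference(500.0, 3.0, 491.0, 3.0))  # True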

PISA never combines mathematics, science and reading domain scores into an overall score. However, commentators have sometimes combined test results from all three domains into an overall country ranking. Such meta-analysis is not endorsed by the OECD, although official summaries sometimes use scores from a testing cycle's principal domain as a proxy for overall student ability.

PISA 2012

The results for the 2012 "Maths" section on a world map.
The results for the 2012 "Science" section on a world map.
The results for the 2012 "Reading" section on a world map.
OECD members as of the time of the study are in boldface.
Mathematics
1  Shanghai, China 613
2  Singapore 573
3  Hong Kong, China 561
4  Taiwan 560
5  South Korea 554
6  Macau, China 538
7  Japan 536
8  Liechtenstein 535
9   Switzerland 531
10  Netherlands 523
11  Estonia 521
12  Finland 519
13=  Canada 518
13=  Poland 518
15  Belgium 515
16  Germany 514
17  Vietnam 511
18  Austria 506
19  Australia 504
20=  Ireland 501
20=  Slovenia 501
22=  Denmark 500
22=  New Zealand 500
24  Czech Republic 499
25  France 495
26  United Kingdom 494
27  Iceland 493
28  Latvia 491
29  Luxembourg 490
30  Norway 489
31  Portugal 487
32  Italy 485
33  Spain 484
34=  Russia 482
34=  Slovakia 482
36  United States 481
37  Lithuania 479
38  Sweden 478
39  Hungary 477
40  Croatia 471
41  Israel 466
42  Greece 453
43  Serbia 449
44  Turkey 448
45  Romania 445
46  Cyprus 440
47  Bulgaria 439
48  United Arab Emirates 434
49  Kazakhstan 432
50  Thailand 427
51  Chile 423
52  Malaysia 421
53  Mexico 413
54  Montenegro 410
55  Uruguay 409
56  Costa Rica 407
57  Albania 394
58  Brazil 391
59=  Argentina 388
59=  Tunisia 388
61  Jordan 386
62=  Colombia 376
62=  Qatar 376
64  Indonesia 375
65  Peru 368
Science
1  Shanghai, China 580
2  Hong Kong, China 555
3  Singapore 551
4  Japan 547
5  Finland 545
6  Estonia 541
7  South Korea 538
8  Vietnam 528
9  Poland 526
10=  Liechtenstein 525
10=  Canada 525
12  Germany 524
13  Taiwan 523
14=  Netherlands 522
14=  Ireland 522
16=  Macau, China 521
16=  Australia 521
18  New Zealand 516
19   Switzerland 515
20=  Slovenia 514
20=  United Kingdom 514
22  Czech Republic 508
23  Austria 506
24  Belgium 505
25  Latvia 502
26  France 499
27  Denmark 498
28  United States 497
29=  Spain 496
29=  Lithuania 496
31  Norway 495
32=  Italy 494
32=  Hungary 494
34=  Luxembourg 491
34=  Croatia 491
36  Portugal 489
37  Russia 486
38  Sweden 485
39  Iceland 478
40  Slovakia 471
41  Israel 470
42  Greece 467
43  Turkey 463
44  United Arab Emirates 448
45  Bulgaria 446
46=  Serbia 445
46=  Chile 445
48  Thailand 444
49  Romania 439
50  Cyprus 438
51  Costa Rica 429
52  Kazakhstan 425
53  Malaysia 420
54  Uruguay 416
55  Mexico 415
56  Montenegro 410
57  Jordan 409
58  Argentina 406
59  Brazil 405
60  Colombia 399
61  Tunisia 398
62  Albania 397
63  Qatar 384
64  Indonesia 382
65  Peru 373
Reading
1  Shanghai, China 570
2  Hong Kong, China 545
3  Singapore 542
4  Japan 538
5  South Korea 536
6  Finland 524
7=  Taiwan 523
7=  Canada 523
7=  Ireland 523
10  Poland 518
11=  Liechtenstein 516
11=  Estonia 516
13=  Australia 512
13=  New Zealand 512
15  Netherlands 511
16=  Macau, China 509
16=   Switzerland 509
16=  Belgium 509
19=  Germany 508
19=  Vietnam 508
21  France 505
22  Norway 504
23  United Kingdom 499
24  United States 498
25  Denmark 496
26  Czech Republic 493
27=  Austria 490
27=  Italy 490
29  Latvia 489
30=  Luxembourg 488
30=  Portugal 488
30=  Spain 488
30=  Hungary 488
34  Israel 486
35  Croatia 485
36=  Iceland 483
36=  Sweden 483
38  Slovenia 481
39=  Lithuania 477
39=  Greece 477
41=  Russia 475
41=  Turkey 475
43  Slovakia 463
44  Cyprus 449
45  Serbia 446
46  United Arab Emirates 442
47=  Thailand 441
47=  Chile 441
47=  Costa Rica 441
50  Romania 438
51  Bulgaria 436
52  Mexico 424
53  Montenegro 422
54  Uruguay 411
55  Brazil 410
56  Tunisia 404
57  Colombia 403
58  Jordan 399
59  Malaysia 398
60=  Argentina 396
60=  Indonesia 396
62  Albania 394
63  Kazakhstan 393
64  Qatar 388
65  Peru 384

PISA 2012 was presented on 3 December 2013, with results for around 510,000 participating students in all 34 OECD member countries and 31 partner countries.[2] This testing cycle had a particular focus on mathematics, where the mean score was 494; the mean score in reading was 496, and in science 501. A sample of 1,688 students from Puerto Rico took the assessment, scoring 379 in math, 404 in reading and 401 in science.[26] A subgroup of 44 countries and economies with about 85,000 students also took part in an optional computer-based assessment of problem solving.[27]

Shanghai had the highest score in all three subjects. It was followed by Singapore, Hong Kong, Chinese Taipei and Korea in mathematics; Hong Kong, Singapore, Japan and Korea in reading and Hong Kong, Singapore, Japan and Finland in science.

The participating students were a sample drawn from about 28 million people of the same age group in 65 countries and economies,[28] including the OECD countries, several Chinese cities, Vietnam, Indonesia and several countries in South America.[2]

The test lasted two hours, was paper-based and included both open-ended and multiple-choice questions.[28]

The students and school staff also answered a questionnaire to provide background information about the students and the schools.[2][28]

The results show distinct groups of high performers in mathematics: the East Asian countries, with Shanghai scoring the best result of 613, followed closely by Hong Kong, Japan, Chinese Taipei and South Korea. Among the Europeans, Liechtenstein and Switzerland performed best, with the Netherlands, Estonia, Finland, Poland, Belgium, Germany and Austria all posting mathematics scores "not significantly statistically different from" one another. The United Kingdom, Ireland, Australia and New Zealand were similarly clustered around the OECD average of 494, with the USA trailing this group at 481.[2]

Qatar, Kazakhstan and Malaysia were the countries that showed the greatest improvement in mathematics. The USA and the United Kingdom showed no significant change.[29] Sweden had the greatest fall in mathematics performance over the previous ten years, with a similar downward trend in the other two subjects; leading politicians in Sweden expressed great concern over the results.[30][31]

On average, boys scored better than girls in mathematics, girls scored better than boys in reading, and the two sexes had quite similar scores in science.[29]

Indonesia, Albania, Peru, Thailand and Colombia were the countries where most students reported being happy at school, while students in Korea, the Czech Republic, the Slovak Republic, Estonia and Finland reported least happiness.[28]

PISA 2015

PISA 2015 was presented on 6 December 2016, with results for around 540,000 participating students in 72 countries, with Singapore emerging as the top performer.[32]

Maths
1  Singapore 564
2  Hong Kong 548
3  Macau 544
4  Taiwan 542
5  Japan 532
6  China 531
7  South Korea 524
8   Switzerland 521
9  Estonia 520
10  Canada 516
11  Netherlands 512
12  Denmark 511
13  Finland 511
14  Slovenia 510
15  Belgium 507
16  Germany 506
17  Poland 504
18  Ireland 504
19  Norway 502
20  Austria 497
21  New Zealand 495
22  Vietnam 495
23  Russia 494
24  Sweden 494
25  Australia 494
26  France 493
27  United Kingdom 492
28  Czech Republic 492
29  Portugal 492
30  Italy 490
31  Iceland 488
32  Spain 486
33  Luxembourg 486
34  Latvia 482
35  Malta 479
36  Lithuania 478
37  Hungary 477
38  Slovakia 475
39  Israel 470
40  United States 470
41  Croatia 464
42  Kazakhstan 460
43  Greece 454
44  Malaysia 446
45  Romania 444
46  Bulgaria 441
47  Cyprus 437
48  United Arab Emirates 427
49  Chile 423
50  Turkey 420
51  Moldova 420
52  Uruguay 418
53  Montenegro 418
54  Trinidad and Tobago 417
55  Thailand 415
56  Albania 413
57  Argentina 409
58  Mexico 408
59  Georgia 404
60  Qatar 402
61  Costa Rica 400
62  Lebanon 396
63  Colombia 390
64  Peru 387
65  Indonesia 386
66  Jordan 380
67  Brazil 377
68  Macedonia 371
69  Tunisia 367
70  Kosovo 362
71  Algeria 360
72  Dominican Republic 328
Science
1  Singapore 556
2  Japan 538
3  Estonia 534
4  Taiwan 532
5  Finland 531
6  Macau 529
7  Canada 528
8  Vietnam 525
9  Hong Kong 523
10  China 518
11  South Korea 516
12  New Zealand 513
13  Slovenia 513
14  Australia 510
15  United Kingdom 509
16  Germany 509
17  Netherlands 509
18  Switzerland 506
19  Ireland 503
20  Belgium 502
21  Denmark 502
22  Poland 501
23  Portugal 501
24  Norway 498
25  United States 496
26  Austria 495
27  France 495
28  Sweden 493
29  Czech Republic 493
30  Spain 493
31  Latvia 490
32  Russia 487
33  Luxembourg 483
34  Italy 481
35  Hungary 477
36  Lithuania 475
37  Croatia 475
38  Iceland 473
39  Israel 467
40  Malta 465
41  Slovakia 461
42  Kazakhstan 456
43  Greece 455
44  Chile 447
45  Bulgaria 446
46  Malaysia 443
47  United Arab Emirates 437
48  Uruguay 435
49  Romania 435
50  Cyprus 433
51  Argentina 432
52  Moldova 428
53  Albania 427
54  Turkey 425
55  Trinidad and Tobago 425
56  Thailand 421
57  Costa Rica 420
58  Qatar 418
59  Colombia 416
60  Mexico 416
61  Montenegro 411
62  Georgia 411
63  Jordan 409
64  Indonesia 403
65  Brazil 401
66  Peru 397
67  Lebanon 386
68  Tunisia 386
69  Macedonia 384
70  Kosovo 378
71  Algeria 376
72  Dominican Republic 332
Reading
1  Singapore 535
2  Hong Kong 527
3  Canada 527
4  Finland 526
5  Ireland 521
6  Estonia 519
7  South Korea 517
8  Japan 516
9  Norway 513
10  New Zealand 509
11  Germany 509
12  Macau 509
13  Poland 506
14  Slovenia 505
15  Netherlands 503
16  Australia 503
17  Sweden 500
18  Denmark 500
19  France 499
20  Belgium 499
21  Portugal 498
22  United Kingdom 498
23  Taiwan 497
24  United States 497
25  Spain 496
26  Russia 495
27  China 494
28   Switzerland 492
29  Latvia 488
30  Czech Republic 487
31  Croatia 487
32  Vietnam 487
33  Austria 485
34  Italy 485
35  Iceland 482
36  Luxembourg 481
37  Israel 479
38  Lithuania 472
39  Hungary 470
40  Greece 467
41  Chile 459
42  Slovakia 453
43  Malta 447
44  Cyprus 443
45  Uruguay 437
46  Romania 434
47  United Arab Emirates 434
48  Bulgaria 432
49  Malaysia 431
50  Turkey 428
51  Costa Rica 427
52  Trinidad and Tobago 427
53  Kazakhstan 427
54  Montenegro 427
55  Argentina 425
56  Colombia 425
57  Mexico 423
58  Moldova 416
59  Thailand 409
60  Jordan 408
61  Brazil 407
62  Albania 405
63  Qatar 402
64  Georgia 401
65  Peru 398
66  Indonesia 397
67  Tunisia 361
68  Dominican Republic 358
69  Macedonia 352
70  Algeria 350
71  Kosovo 347
72  Lebanon 347

Previous years

Period Focus OECD countries Partner countries Participating students Notes
2000 Reading 28 4 + 11 265,000 The Netherlands disqualified from data analysis. 11 additional non-OECD countries took the test in 2002.
2003 Mathematics 30 11 275,000 UK disqualified from data analysis. Also included test in problem solving.
2006 Science 30 27 400,000 Reading scores for US disqualified from analysis due to misprint in testing materials.[33]
2009[34] Reading 34 41 + 10 470,000 10 additional non-OECD countries took the test in 2010.[35][36]
2012[2] Mathematics 34 31 510,000

Reception

China

China's participation in the 2012 test was limited to Shanghai, Hong Kong, and Macao as separate entities. In 2012, Shanghai participated for the second time, again topping the rankings in all three subjects, as well as improving scores in the subjects compared to the 2009 tests. Shanghai's score of 613 in mathematics was 113 points above the average score, putting the performance of Shanghai pupils about 3 school years ahead of pupils in average countries. Educational experts debated to what degree this result reflected the quality of the general educational system in China, pointing out that Shanghai has greater wealth and better-paid teachers than the rest of China.[37] Hong Kong placed second in reading and science and third in maths.

China is expected to participate in 2018 as an entire unit. In 2015, four regions (Jiangsu, Guangdong, Beijing, and Shanghai), with a total population of over 230 million, participated as a single entity.[38][39][40] This Beijing-Shanghai-Jiangsu-Guangdong cohort scored a median 518 in science in 2015, while the 2012 Shanghai cohort had scored a median 580.

Critics of PISA counter that in Shanghai and other Chinese cities, most children of migrant workers can only attend city schools up to the ninth grade and must return to their parents' hometowns for high school due to hukou restrictions, thus skewing the composition of the city's high school students in favor of wealthier local families. A population chart of Shanghai reproduced in The New York Times shows a steep drop-off in the number of 15-year-olds residing there.[41] According to Schleicher, 27% of Shanghai's 15-year-olds are excluded from its school system (and hence from testing). As a result, the percentage of Shanghai's 15-year-olds tested by PISA was 73%, lower than the 89% tested in the US.[42] Following the 2015 testing, the OECD published in-depth studies of the education systems of a select few countries, including China.[43]

Finland

Finland, which received several top positions in the first tests, fell in all three subjects in 2012 but remained the best-performing country overall in Europe, achieving its best result in science with 545 points (5th) and its worst in mathematics with 519 (12th), in which it was outperformed by four other European countries. The drop in mathematics was 25 points since 2003, the last time mathematics was the focus of the tests. For the first time, Finnish girls outperformed boys in the subject, though only narrowly. It was also the first time that pupils in Finnish-speaking schools did not perform better than pupils in Swedish-speaking schools. Minister of Education and Science Krista Kiuru expressed concern over the overall drop, as well as over the fact that the proportion of low performers had increased from 7% to 12%.[44]

India

India participated in the 2009 round of testing but pulled out of the 2012 round in August 2012, with the Indian government attributing its decision to the unfairness of PISA testing to Indian students.[45] The Indian Express reported on 3 September 2012 that "The ministry (of education) has concluded that there was a socio-cultural disconnect between the questions and Indian students. The ministry will write to the OECD and drive home the need to factor in India's 'socio-cultural milieu'. India's participation in the next PISA cycle will hinge on this".[46] The Indian Express also noted that "Considering that over 70 nations participate in PISA, it is uncertain whether an exception would be made for India".

In June 2013, the Indian government, still concerned about the fairness of PISA testing for Indian students, again withdrew India, this time from the 2015 round of testing.[47]

Sweden

Sweden's results dropped in all three subjects in the 2012 test, continuing a trend from the 2006 and 2009 tests. The nation had the sharpest fall in mathematics performance over ten years among the countries that have participated in all tests, with a drop in score from 509 in 2003 to 478 in 2012. The score in reading fell from 516 in 2000 to 483 in 2012. The country performed below the OECD average in all three subjects.[48] The leader of the opposition, Social Democrat Stefan Löfven, described the situation as a national crisis.[49] Along with the party's spokesperson on education, Ibrahim Baylan, he pointed to the downward trend in reading as the most severe.[49]

UK

In the 2012 test, as in 2009, the United Kingdom's results were slightly above average, with its science ranking being the highest (20th).[50] England, Wales, Scotland and Northern Ireland also participated as separate entities; Wales showed the worst results, ranking 43rd of the 65 countries and economies in mathematics. The Minister of Education in Wales, Huw Lewis, expressed disappointment in the results, said that there were no "quick fixes", but hoped that the several educational reforms implemented in the last few years would yield better results in the next round of tests.[51] The United Kingdom had a greater gap between high- and low-scoring students than the average. There was little difference between public and private schools when adjusted for the socio-economic background of students. The gender difference in favour of girls was smaller than in most other countries, as was the difference between natives and immigrants.[50]

Writing in The Daily Telegraph, Ambrose Evans-Pritchard warned against putting too much emphasis on the UK's international ranking, arguing that an overfocus on scholastic performance in East Asia might have contributed to the region's low birthrate, which he argued could harm future economic performance more than a good PISA score would benefit it.[52]

In 2013, the Times Educational Supplement (TES) published an article, "Is PISA Fundamentally Flawed?" by William Stewart, detailing serious critiques of PISA's conceptual foundations and methods advanced by statisticians at major universities.[53]

In the article, Professor Harvey Goldstein of the University of Bristol was quoted as saying that when the OECD tries to rule out questions suspected of bias, it can have the effect of "smoothing out" key differences between countries. "That is leaving out many of the important things," he warned. "They simply don't get commented on. What you are looking at is something that happens to be common. But (is it) worth looking at? PISA results are taken at face value as providing some sort of common standard across countries. But as soon as you begin to unpick it, I think that all falls apart."

Queen's University Belfast mathematician Dr. Hugh Morrison stated that he found the statistical model underlying PISA to contain a fundamental, insoluble mathematical error that renders PISA rankings "valueless".[54] Goldstein remarked that Dr. Morrison's objection highlights "an important technical issue", if not a "profound conceptual error". However, Goldstein cautioned that PISA has been "used inappropriately", contending that some of the blame for this "lies with PISA itself. I think it tends to say too much for what it can do and it tends not to publicise the negative or the weaker aspects." Professors Morrison and Goldstein expressed dismay at the OECD's response to criticism. Morrison said that when he first published his criticisms of PISA in 2004 and also personally queried several of the OECD's "senior people" about them, his points were met with "absolute silence" and have yet to be addressed. "I was amazed at how unforthcoming they were," he told TES. "That makes me suspicious." "Pisa steadfastly ignored many of these issues," he said. "I am still concerned."[55]

Professor Svend Kreiner of the University of Copenhagen agreed: "One of the problems that everybody has with PISA is that they don't want to discuss things with people criticising or asking questions concerning the results. They didn't want to talk to me at all. I am sure it is because they can't defend themselves."[55]

US

The American result of 2012 was average in science and reading, but lagged behind in mathematics compared to other developed nations. There was little change from the previous test in 2009.[56] The result was described as “a picture of educational stagnation” by Education Secretary Arne Duncan,[57] who said the result was not compatible with the American goal of having the world's best educated workers. Randi Weingarten of the American Federation of Teachers stated that an overemphasis on standardised tests contributed to the lack of improvement in education performance.[58] Dennis Van Roekel of the National Education Association said a failure to address poverty among students had hampered progress.[56]

About 9% of the U.S. students scored in the top two mathematics levels compared to 13% in all countries and economies.[56]

For the first time, three U.S. states participated in the tests as separate entities, with Massachusetts scoring well above both the American and international averages, particularly in reading.[58] An approximate corresponding OECD ranking is shown along with the United States average.[59]

Maths
16=  Massachusetts 514
18=  Connecticut 506
36  U.S. Average 481
41~  Florida 467
Science
9~  Massachusetts 527
16=  Connecticut 521
28  U.S. Average 497
38=  Florida 485
Reading
6~  Massachusetts 527
10~  Connecticut 521
24  U.S. Average 498
26~  Florida 492

Malaysia

In 2015, the results from Malaysia were found by the OECD to have not met the minimum response rate.[60] Opposition politician Ong Kian Ming said the education ministry tried to oversample high-performing students in rich schools.[61][62]

Research on possible causes of PISA disparities in different countries

Although PISA and TIMSS officials and researchers themselves generally refrain from hypothesizing about the large and stable differences in student achievement between countries, since 2000, literature on the differences in PISA and TIMSS results and their possible causes has emerged.[63] Data from PISA have furnished several economists, notably Eric Hanushek, Ludger Woessmann, Heiner Rindermann, and Stephen J. Ceci, with material for books and articles about the relationship between student achievement and economic development,[64] democratization, and health;[65] as well as the roles of such single educational factors as high-stakes exams,[66] the presence or absence of private schools, and the effects and timing of ability tracking.[67]

See also

References

  1. ^ Berger, Kathleen. Invitation to The Life Span (second ed.). worth. ISBN 978-1-4641-7205-2. 
  2. ^ a b c d e f g PISA 2012 Results in Focus (PDF), OECD, 3 December 2013, retrieved 4 December 2013 
  3. ^ "Launch of PISA 2015 Results". OECD PISA. Retrieved 2016-08-12. 
  4. ^ a b c d e Rey O, ‘The use of external assessments and the impact on education systems’ in CIDREE Yearbook 2010, accessed January 2017 at http://www.cidree.org/publications/yearbook_2010?PHPSESSID=baip221utd9v77b89hov0s3al6
  5. ^ McGaw, B (2008) ‘The role of the OECD in international comparative studies of achievement’ Assessment in Education: Principles, Policy & Practice, 15:3, 223-243
  6. ^ Mons N, (2008) ‘Évaluation des politiques éducatives et comparaisons internationales’, Revue française de pédagogie, 164, juillet-août-septembre 2008 5-13
  7. ^ a b c d e f Breakspear S ‘The Policy Impact of PISA: An Exploration of the Normative Effects of International Benchmarking in School System Performance’, OECD Education Working Paper number 71, 2012
  8. ^ Barroso, J. and de Carvalho, L.M. (2008) ‘Pisa: Un instrument de régulation pour relier des mondes’, Revue française de pédagogie, 164, 77–80
  9. ^ Ertl, H (2006). 'Educational standards and the changing discourse on education: the reception and consequences of the PISA study in Germany', Oxford Review of Education, 32, 5, 619-634.
  10. ^ Bajomi, I., Berényi, E., Neumann, E. and Vida, J. (2009). ‘The Reception of PISA in Hungary’ accessed January 2017 at http://www.knowandpol.eu/IMG/pdf/pisa.wp12.hungary.pdf
  11. ^ Breakspear S, citing Steiner-Khamsi, 2003, in 'The Policy Impact of PISA: An Exploration of the Normative Effects of International Benchmarking in School System Performance', OECD Education Working Paper number 71, 2012
  12. ^ Mangez, E. and Cattonar, B. (2009). ‘The status of PISA in the relationship between civil society and the educational sector in French-speaking Belgium’, Sísifo Educational Sciences Journal, 10, 15–26 [online]. Available: http://sisifo.fpce.ul.pt/?r=25t
  13. ^ Greger, D. (2008). ‘Lorsque PISA importe peu. Le cas de la République Tchèque et de l’Allemagne’, Revue française de pédagogie, 164, 91–98. cited in Rey O, ‘The use of external assessments and the impact on education systems’ in CIDREE Yearbook 2010, accessed January 2017 at http://www.cidree.org/publications/yearbook_2010?PHPSESSID=baip221utd9v77b89hov0s3al6
  14. ^ Alfonso, N. and Costa, E. (2009). ‘The influence of the Programme for International Student Assessment (PISA) on policy decision in Portugal: the education policies of the 17th Portuguese Constitutional Government’, Sísifo Educational Sciences Journal, 10, 53–64. Accessed at: http://sisifo.fpce.ul.pt/?r=25
  15. ^ Rautalin M and Alasuutari:(2009): ‘The uses of the national PISA results by Finnish officials in central government’, Journal of Education Policy, 24:5, 539-556
  16. ^ Egelund, N. (2008). ‘The value of international comparative studies of achievement – a Danish perspective’, Assessment in Education: Principles, Policy & Practice, 15, 3, 245–251
  17. ^ Behrens, 2006 cited in Rey O, ‘The use of external assessments and the impact on education systems’ in CIDREE Yearbook 2010, accessed January 2017 at http://www.cidree.org/publications/yearbook_2010?PHPSESSID=baip221utd9v77b89hov0s3al6
  18. ^ Hefling, Kimberly. "Asian nations dominate international test". Yahoo!. 
  19. ^ "Chapter 2 of the publication 'PISA 2003 Assessment Framework'" (pdf). Pisa.oecd.org. 
  20. ^ Keeley B. PISA, we have a problem… OECD Insights, April 2014.
  21. ^ Poddiakov, A.N. 'Complex Problem Solving at PISA 2012 and PISA 2015: Interaction with Complex Reality.' Translated from Russian; original: Poddiakov, A. (2012). Reshenie kompleksnykh problem v PISA-2012 i PISA-2015: vzaimodeistvie so slozhnoi real'nost'yu. Obrazovatel'naya Politika, 6, 34-53.
  22. ^ C. Füller: Pisa hat einen kleinen, fröhlichen Bruder. taz, 5.12.2007 [1]
  23. ^ Stanat, P; Artelt, C; Baumert, J; Klieme, E; Neubrand, M; Prenzel, M; Schiefele, U; Schneider, W (2002), PISA 2000: Overview of the study—Design, method and results, Berlin: Max Planck Institute for Human Development 
  24. ^ Mazzeo, John; von Davier, Matthias (2013), Linking Scales in International Large-Scale Assessments, chapter 10 in Rutkowski, L. von Davier, M. & Rutkowski, D. (eds.) Handbook of International Large-Scale Assessment: Background, Technical Issues, and Methods of Data Analysis., New York: Chapman and Hall/CRC. 
  25. ^ von Davier, Matthias; Sinharay, Sandip (2013), Analytics in International Large-Scale Assessments: Item Response Theory and Population Models, chapter 7 in Rutkowski, L. von Davier, M. & Rutkowski, D. (eds.) Handbook of International Large-Scale Assessment: Background, Technical Issues, and Methods of Data Analysis., New York: Chapman and Hall/CRC. 
  26. ^ CB Online Staff. "PR scores low on global report card", Caribbean Business, September 26, 2014. Retrieved on January 3, 2015.
  27. ^ OECD (2014): PISA 2012 results: Creative problem solving: Students’ skills in tackling real-life problems (Volume V), http://www.oecd-ilibrary.org/education/pisa-2012-results-skills-for-life-volume-v_9789264208070-en
  28. ^ a b c d PISA 2012 Results OECD. Retrieved 4 December 2013
  29. ^ a b Sedghi, Ami; Arnett, George; Chalabi, Mona (2013-12-03), Pisa 2012 results: which country does best at reading, maths and science?, The Guardian, retrieved 2013-02-14 
  30. ^ Adams, Richard (2013-12-03), Swedish results fall abruptly as free school revolution falters, The Guardian, retrieved 2013-12-03 
  31. ^ Kärrman, Jens (2013-12-03), Löfven om Pisa: Nationell kris, Dagens Nyheter, retrieved 2013-12-03 
  32. ^ Singapore tops latest OECD PISA global education survey, OECD, 6 December 2016, retrieved 13 December 2016 
  33. ^ Baldi, Stéphane; Jin, Ying; Skemer, Melanie; Green, Patricia J; Herget, Deborah; Xie, Holly (2007-12-10), Highlights From PISA 2006: Performance of U.S. 15-Year-Old Students in Science and Mathematics Literacy in an International Context (PDF), NCES, retrieved 2013-12-14, PISA 2006 reading literacy results are not reported for the United States because of an error in printing the test booklets. Furthermore, as a result of the printing error, the mean performance in mathematics and science may be misestimated by approximately 1 score point. The impact is below one standard error. 
  34. ^ PISA 2009 Results: Executive Summary (PDF), OECD, 2010-12-07 
  35. ^ ACER releases results of PISA 2009+ participant economies, ACER, 2011-12-16 
  36. ^ Walker, Maurice (2011), PISA 2009 Plus Results (PDF), OECD, retrieved 2012-06-28 
  37. ^ Tom Phillips (3 December 2013) OECD education report: Shanghai's formula is world-beating The Telegraph. Retrieved 8 December 2013
  38. ^ Coughlan, Sean (26 August 2014). "Pisa tests to include many more Chinese pupils" – via www.bbc.com. 
  39. ^ Harvey Morris (2016-12-06). "Asia dominates world education rankings". China Daily. 
  40. ^ Amy He (2016-12-07). "China's students fall in rank on assessment test". China Daily. 
  41. ^ Helen Gao, "Shanghai Test Scores and the Mystery of the Missing Children", New York Times, January 23, 2014. For Schleicher's initial response to these criticisms see his post, "Are the Chinese Cheating in PISA Or Are We Cheating Ourselves?" on the OECD's website blog, Education Today, December 10, 2013.
  42. ^ William Stewart, "More than a quarter of Shanghai pupils missed by international Pisa rankings", Times Educational Supplement, March 6, 2014.
  43. ^ http://www.oecd.org/china/Education-in-China-a-snapshot.pdf
  44. ^ PISA 2012: Proficiency of Finnish youth declining University of Jyväskylä. Retrieved 9 December 2013
  45. ^ Hemali Chhapia, TNN (3 August 2012). "India backs out of global education test for 15-year-olds". The Times of India. 
  46. ^ "Poor PISA score: Govt blames 'disconnect' with India". The Indian Express. 3 September 2012. 
  47. ^ "India chickens out of international students assessment programme again". The Times of India. 1 June 2013. 
  48. ^ Lars Näslund (3 December 2013) Svenska skolan rasar i stor jämförelse Expressen. Retrieved 4 December 2013 (Swedish)
  49. ^ a b Jens Kärrman (3 December 2013) Löfven om Pisa: Nationell kris Dagens Nyheter. Retrieved 8 December 2013 (Swedish)
  50. ^ a b Adams, Richard (2013-12-03), UK students stuck in educational doldrums, OECD study finds, The Guardian, retrieved 2013-12-04 
  51. ^ Pisa ranks Wales' education the worst in the UK BBC. 3 December 2013. Retrieved 4 December 2013.
  52. ^ Ambrose Evans-Pritchard (3 December 2013) Ambrose Evans-Pritchard Telegraph.co.uk. Retrieved 4 December 2013.
  53. ^ William Stewart, "Is Pisa fundamentally flawed?" Times Educational Supplement, July 26, 2013.
  54. ^ http://www.qub.ac.uk/schools/SchoolofEducation/AboutUs/Staff/Academic/DrHughMorrison/Filestore/Filetoupload,387514,en.pdf
  55. ^ a b Stewart, "Is PISA fundamentally flawed?" TES (2013).
  56. ^ a b c Motoko Rich (3 December 2013) American 15-Year-Olds Lag, Mainly in Math, on International Standardized Tests New York Times. Retrieved 4 December 2013
  57. ^ Simon, Stephanie (2013-12-03), PISA results show "educational stagnation" in US, Politico, retrieved 2013-12-03 
  58. ^ a b Vaznis, James (2013-12-03), Mass. students excel on global examinations, Boston Globe, retrieved 2013-12-14 
  59. ^ 2012 Program for International Student Assessment (PISA) Results (PDF), Massachusetts Department of Education, retrieved 2014-12-11 
  60. ^ "Ong: Did ministry try to rig results for Pisa 2015 report?". 8 December 2016. 
  61. ^ "Who's telling the truth about M'sia's Pisa 2015 scores?". 9 December 2016. 
  62. ^ "Malaysian PISA results under scrutiny for lack of evidence - School Advisor". 8 December 2016. 
  63. ^ Hanushek, Eric A., and Ludger Woessmann. 2011. "The economics of international differences in educational achievement." In Handbook of the Economics of Education, Vol. 3, edited by Eric A. Hanushek, Stephen Machin, and Ludger Woessmann. Amsterdam: North Holland: 89–200.
  64. ^ Hanushek, Eric; Woessmann, Ludger (2008), "The role of cognitive skills in economic development" (PDF), Journal of Economic Literature, 46 (3): 607–668, doi:10.1257/jel.46.3.607 
  65. ^ Rindermann, Heiner; Ceci, Stephen J (2009), "Educational policy and country outcomes in international cognitive competence studies", Perspectives on Psychological Science, 4 (6): 551–577, doi:10.1111/j.1745-6924.2009.01165.x 
  66. ^ Bishop, John H (1997), "The effect of national standards and curriculum-based exams on achievement", American Economic Review, 87 (2): 260–264 
  67. ^ Hanushek, Eric; Woessmann, Ludger (2006), "Does educational tracking affect performance and inequality? Differences-in-differences evidence across countries" (PDF), Economic Journal, 116 (510): C63–C76, doi:10.1111/j.1468-0297.2006.01076.x 

Further reading

Official websites and reports

  • OECD/PISA website
    • OECD (1999): Measuring Student Knowledge and Skills. A New Framework for Assessment. Paris: OECD, ISBN 92-64-17053-7 [2]
    • OECD (2014): PISA 2012 results: Creative problem solving: Students’ skills in tackling real-life problems (Volume V) [3]

External links