Programme for International Student Assessment


From Wikipedia, the free encyclopedia

The Programme for International Student Assessment (PISA) is a worldwide evaluation of 15-year-old school children's scholastic performance, performed first in 2000 and repeated every three years. It is coordinated by the Organisation for Economic Co-operation and Development (OECD), with a view to improving educational policies and outcomes.


Framework

PISA stands in a tradition of international school studies, undertaken since the late 1950s by the International Association for the Evaluation of Educational Achievement (IEA). Much of PISA's methodology follows the example of the Trends in International Mathematics and Science Study (TIMSS, started in 1995), which in turn was much influenced by the U.S. National Assessment of Educational Progress (NAEP). The reading component of PISA is inspired by the IEA's Progress in International Reading Literacy Study (PIRLS).

PISA aims to test literacy in three competence fields: reading, mathematics, and science.

The PISA mathematics literacy test asks students to apply their mathematical knowledge to solve problems set in various real-world contexts. To solve the problems students must activate a number of mathematical competencies as well as a broad range of mathematical content knowledge. TIMSS, on the other hand, measures more traditional classroom content such as an understanding of fractions and decimals and the relationship between them (curriculum attainment). PISA claims to measure education's application to real-life problems and life-long learning (workforce knowledge).

In the reading test, "OECD/PISA does not measure the extent to which 15-year-old students are fluent readers or how competent they are at word recognition tasks or spelling". Instead, they should be able to "construct, extend and reflect on the meaning of what they have read across a wide range of continuous and non-continuous texts".[1]

Development and implementation

Development began in 1997, and the first PISA assessment was carried out in 2000. Analysing the results of each assessment round takes about a year and a half: the first results were published in November 2001, and the raw data, technical report and data handbook were released only in spring 2002. The triennial repeats follow a similar schedule, so a single PISA cycle takes over four years from start to finish.

Each period of assessment focuses on one of the three competence fields (reading, mathematics, science), but the other two are tested as well. A full cycle is thus completed every nine years: after 2000, reading is again the main domain in 2009.

Period | Main focus | OECD countries | Other countries | Students | Notes
2000 | Reading | 28 | 4 | 265,000 | Netherlands disqualified from data analysis; 11 additional non-OECD countries took the test in 2002
2003 | Mathematics | 30 | 11 | 275,000 | UK disqualified from data analysis; also included a test in problem solving
2006 | Science | 30 | 27 | |
2009 | Reading | 30 | 33? | | Results will be published in fall 2010

PISA is sponsored, governed, and coordinated by the OECD. Test design, implementation, and data analysis are delegated to an international consortium of research and educational institutions led by the Australian Council for Educational Research (ACER). ACER develops and implements the sampling procedures and assists with monitoring sampling outcomes across participating countries. It likewise constructs and refines the assessment instruments for reading, mathematics, science and problem solving, the computer-based tests, and the background and contextual questionnaires. ACER also develops purpose-built software to assist in sampling and data capture, and analyses all data. The source code of the data analysis software is not made public.

Method of testing

Sampling

The students tested by PISA are aged between 15 years and 3 months and 16 years and 2 months at the beginning of the assessment period. The school year the pupils are in is not taken into consideration. Only students at school are tested, not home-schoolers. In PISA 2006, however, several countries also used a grade-based sample of students, which made it possible to study how age and school year interact.

To fulfill OECD requirements, each country must draw a sample of at least 5,000 students. In small countries like Iceland and Luxembourg, where there are fewer than 5,000 students per year, the entire age cohort is tested. Some countries use much larger samples than required in order to allow comparisons between regions.
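The sample-size rule above can be stated compactly. The sketch below is illustrative only; the actual PISA design is a stratified two-stage sample (schools, then students within schools):

```python
MIN_SAMPLE = 5000  # OECD minimum number of tested students per country

def required_sample(cohort_size, min_sample=MIN_SAMPLE):
    """Minimum number of students a country must test: at least
    min_sample, or the entire age cohort if it is smaller (a census,
    as in Iceland or Luxembourg)."""
    return min(cohort_size, min_sample)

# Large country: an ordinary sample suffices.
assert required_sample(800000) == 5000
# Small country: the whole cohort is tested.
assert required_sample(4000) == 4000
```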

The test

PISA test documents on a school table (Neues Gymnasium, Oldenburg, Germany, 2006)

Each student takes a two-hour handwritten test. Part of the test is multiple-choice and part involves fuller answers. In total there are six and a half hours of assessment material, but each student is not tested on all the parts. Following the cognitive test, participating students spend nearly one more hour answering a questionnaire on their background including learning habits, motivation and family. School directors also fill in a questionnaire describing school demographics, funding etc.
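Spreading six and a half hours of material over two-hour booklets requires a rotated booklet design, in which each booklet carries a subset of item clusters and every cluster appears equally often. The layout below (13 clusters, 4 consecutive clusters per booklet) is an illustrative sketch, not the exact PISA design:

```python
from collections import Counter

N_CLUSTERS = 13           # illustrative number of item clusters
CLUSTERS_PER_BOOKLET = 4  # illustrative booklet size

def make_booklets(n_clusters=N_CLUSTERS, per_booklet=CLUSTERS_PER_BOOKLET):
    """One booklet per cluster; booklet i holds clusters i..i+3 (wrapping)."""
    return [tuple((i + k) % n_clusters for k in range(per_booklet))
            for i in range(n_clusters)]

booklets = make_booklets()

# Every cluster appears in the same number of booklets, so item
# difficulties can be compared across the whole student sample.
counts = Counter(c for b in booklets for c in b)
assert set(counts.values()) == {CLUSTERS_PER_BOOKLET}
```

This balance is what later allows responses from different booklets to be placed on one common scale.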

In selected countries, PISA has also begun experimenting with computer-adaptive testing.

National add-ons

Countries are allowed to combine PISA with complementary national tests.

Germany does this in a very extensive way: on the day following the international test, students take a national test called PISA-E (E=Ergänzung=complement). Test items of PISA-E are closer to TIMSS than to PISA. While only about 5,000 German students participate in both the international and the national test, another 45,000 take only the latter. This large sample is needed in order to allow an analysis by federal states. Following a clash about the interpretation of 2006 results, the OECD warned Germany that it might withdraw the right to use the "PISA" label for national tests.[2]

Data Scaling

From the beginning, PISA has been designed with one particular method of data analysis in mind. Since students work on different test booklets, raw scores must be scaled to allow meaningful comparisons. This scaling is done using the Rasch model of item response theory (IRT). According to IRT, it is not possible to assess the competence of students who solved none or all of the test items. This problem is circumvented by imposing a Gaussian prior probability distribution of competences.[3]
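Under the Rasch model, a student with ability theta solves an item of difficulty b with probability 1/(1+exp(-(theta-b))). The sketch below (with illustrative item difficulties, not PISA's) shows why a perfect score admits no finite maximum-likelihood ability estimate: the likelihood keeps growing as theta increases, which is why a prior distribution must be imposed.

```python
import math

def rasch_p(theta, b):
    """Probability of a correct response under the Rasch model."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

def log_likelihood(theta, difficulties, responses):
    """Log-likelihood of a response pattern (1 = correct, 0 = wrong)."""
    ll = 0.0
    for b, x in zip(difficulties, responses):
        p = rasch_p(theta, b)
        ll += math.log(p) if x else math.log(1.0 - p)
    return ll

difficulties = [-1.0, 0.0, 1.0]  # illustrative item difficulties
perfect = [1, 1, 1]              # a student who solved every item

# For a perfect score the likelihood increases monotonically in theta:
# there is no finite maximum, hence no ML estimate.
assert (log_likelihood(1, difficulties, perfect)
        < log_likelihood(5, difficulties, perfect)
        < log_likelihood(20, difficulties, perfect))
```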

One and the same scale is used to express item difficulties and student competences. The scaling procedure is tuned such that the a posteriori distribution of student competences, with equal weight given to all OECD countries, has mean 500 and standard deviation 100.
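The final linear transformation onto the reporting scale can be sketched as follows. The raw latent scores here are synthetic; in PISA the rescaling is applied to the OECD-weighted pool of student competence estimates:

```python
import random
import statistics

random.seed(0)
# Synthetic stand-in for latent ability estimates from the Rasch scaling.
raw = [random.gauss(0.0, 1.0) for _ in range(10000)]

def to_pisa_scale(scores, target_mean=500.0, target_sd=100.0):
    """Linearly rescale scores so the pool has the target mean and SD."""
    m = statistics.mean(scores)
    sd = statistics.pstdev(scores)
    return [target_mean + target_sd * (x - m) / sd for x in scores]

scaled = to_pisa_scale(raw)
assert abs(statistics.mean(scaled) - 500.0) < 1e-6
assert abs(statistics.pstdev(scaled) - 100.0) < 1e-6
```

Because the transformation is linear, it changes only the units of the scale, not the ordering or relative spacing of students.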

Results

League Tables

All PISA results are broken down by country. Public attention concentrates on a single outcome: mean achievement values by country, which are regularly published in the form of "league tables".

The following table gives the mean achievements of OECD member countries in the principal testing domain of each period:[4]

2000: Reading literacy

1. Finland 546
2. Canada 534
3. New Zealand 529
4. Australia 528
5. Ireland 527
6. South Korea 525
7. United Kingdom 523
8. Japan 522
9. Sweden 516
10. Austria 507
11. Belgium 507
12. Iceland 507
13. Norway 505
14. France 505
15. United States 504
16. Denmark 497
17. Switzerland 494
18. Spain 493
19. Czech Republic 492
20. Italy 487
21. Germany 484
22. Hungary 480
23. Poland 479
24. Greece 474
25. Portugal 470
26. Luxembourg 441
27. Mexico 422

2003: Mathematics

1. Finland 544
2. South Korea 542
3. Netherlands 538
4. Japan 534
5. Canada 532
6. Belgium 529
7. Switzerland 527
8. Australia 524
9. New Zealand 523
10. Czech Republic 516
11. Iceland 515
12. Denmark 514
13. France 511
14. Sweden 509
15. Austria 506
16. Germany 503
17. Ireland 503
18. Slovakia 498
19. Norway 495
20. Luxembourg 493
21. Poland 490
22. Hungary 490
23. Spain 485
24. United States 483
25. Italy 466
26. Portugal 466
27. Greece 445
28. Turkey 423
29. Mexico 385

2006: Science

1. Finland 563
2. Canada 534
3. Japan 531
4. New Zealand 530
5. Australia 527
6. Netherlands 525
7. South Korea 522
8. Germany 516
9. United Kingdom 515
10. Czech Republic 513
11. Switzerland 512
12. Austria 511
13. Belgium 510
14. Ireland 508
15. Hungary 504
16. Sweden 503
17. Poland 498
18. Denmark 496
19. France 495
20. Iceland 491
21. United States 489
22. Slovakia 488
23. Spain 488
24. Norway 487
25. Luxembourg 486
26. Italy 475
27. Portugal 474
28. Greece 473
29. Turkey 424
30. Mexico 410

In the official reports, country rankings are communicated in a more elaborate form: not as lists, but as cross tables, indicating for each pair of countries whether or not mean score differences are statistically significant (unlikely to be due to random fluctuations in student sampling or in item functioning). In favorable cases, a difference of 9 points is sufficient to be considered significant.
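Significance here is an ordinary two-sample z-test on the country means. With standard errors of about 3.2 points for each country mean (an illustrative figure of the typical order of magnitude, not an official PISA value), a 9-point gap is just significant at the 5% level:

```python
import math

def significant(mean_a, mean_b, se_a, se_b, z_crit=1.96):
    """Two-sided z-test for a difference of two independent means."""
    z = abs(mean_a - mean_b) / math.sqrt(se_a**2 + se_b**2)
    return z > z_crit

# Illustrative standard errors of 3.2 points per country mean:
# a 9-point gap gives z = 9 / sqrt(3.2^2 + 3.2^2) = 1.99 > 1.96.
assert significant(509, 500, 3.2, 3.2)
# A 5-point gap is not significant at these standard errors.
assert not significant(505, 500, 3.2, 3.2)
```

Larger standard errors (smaller or less precise samples) push the threshold above 9 points, which is why the official cross tables must be computed pair by pair.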

In some popular media, test results from all three literacy domains have been consolidated into an overall country ranking. Such meta-analysis is not endorsed by the OECD: the official reports contain only domain-specific country scores. In parts of the official reports, however, scores from a period's principal testing domain are used as a proxy for overall student ability.[5]

Comparison with other studies

The correlation between PISA 2003 and TIMSS 2003 grade 8 country means is 0.84 in math, 0.95 in science. The values go down to 0.66 and 0.79 if the two worst performing developing countries are excluded. Western countries generally performed better in PISA; Eastern European and Asian countries in TIMSS. Content balance and years of schooling explain most of the variation.[6]
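A country-level correlation of this kind is an ordinary Pearson correlation over the paired country means. A minimal sketch, using hypothetical scores rather than the actual PISA and TIMSS values:

```python
import math

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical country means on two assessments (for illustration only).
study_a = [544, 532, 516, 503, 485, 423]
study_b = [570, 540, 520, 508, 495, 460]
r = pearson(study_a, study_b)
assert 0.9 < r < 1.0
```

Note how sensitive such a correlation is to outlying countries: dropping the lowest-scoring pair above would noticeably change r, which mirrors the drop from 0.84 to 0.66 reported in the study.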

Topical studies

An evaluation of the 2003 results showed that the countries which spent more on education did not necessarily do better than those which spent less. Australia, Belgium, Canada, the Czech Republic, Finland, Japan, Korea, New Zealand and the Netherlands spent less but did relatively well, whereas the United States spent much more but was below the OECD average. The Czech Republic, in the top ten, spent only one third as much per student as the United States did, for example, but the USA came 24th out of 29 countries compared.

Another point made in the evaluation was that students with higher-earning parents tend to be better educated and to achieve higher results. This was true in all the countries tested, although it was more pronounced in some countries, such as Germany.

Reception

For many countries, the first PISA results were surprising; in Germany and the United States, for example, the comparatively low scores brought on heated debate about how the school system should be changed.[citation needed]

Criticism

Criticism has ensued in Luxembourg, which scored quite low, over the method used in its PISA test: although the country is trilingual, the test could not be taken in Luxembourgish, the mother tongue of the majority of students.[citation needed]

Minor criticism has ensued in Denmark over the fact that the PISA test focuses on immediately measurable skills such as reading, writing, spelling and arithmetic, and thus neglects less measurable factors such as social skills and personal development, which the Danish school system values highly.

References

  1. ^ Chapter 2 of the publication "PISA 2003 Assessment Framework", pdf
  2. ^ C. Füller: Pisa hat einen kleinen, fröhlichen Bruder. taz, 5.12.2007 [1]
  3. ^ The scaling procedure is described in nearly identical terms in the Technical Reports of PISA 2000, 2003, 2006. It is similar to procedures employed in NAEP and TIMSS. According to J. Wuttke Die Insignifikanz signifikanter Unterschiede. (2007, in German), the description in the Technical Reports is incomplete and plagued by notational errors.
  4. ^ OECD (2001) p. 53; OECD (2004a) p. 92; OECD (2007) p. 56.
  5. ^ E.g. OECD (2001), chapters 7 and 8: Influence of school organization and socio-economic background upon performance in the reading test. Reading was the main domain of PISA 2000.
  6. ^ M. L. Wu: A Comparison of PISA and TIMSS 2003 achievement results in Mathematics. Paper presented at the AERA Annual Meeting, New York, March, 2008. [2].

Further reading

Official websites and reports

  • OECD/PISA website (Javascript required)
    • OECD (1999): Measuring Student Knowledge and Skills. A New Framework for Assessment. Paris: OECD, ISBN 92-64-17053-7 [3]
    • OECD (2001): Knowledge and Skills for Life. First Results from the OECD Programme for International Student Assessment (PISA) 2000.
    • OECD (2003a): The PISA 2003 Assessment Framework. Mathematics, Reading, Science and Problem Solving Knowledge and Skills. Paris: OECD, ISBN 978-92-64-10172-2 [4]
    • OECD (2004a): Learning for Tomorrow's World. First Results from PISA 2003. Paris: OECD, ISBN 978-92-64-00724-6 [5]
    • OECD (2004b): Problem Solving for Tomorrow's World. First Measures of Cross-Curricular Competencies from PISA 2003. Paris: OECD, ISBN 978-92-64-00642-3
    • OECD (2005): PISA 2003 Technical Report. Paris: OECD, ISBN 978-92-64-01053-6
    • OECD (2007): Science Competencies for Tomorrow's World: Results from PISA 2006 [6]

About reception and political consequences

  • General:
    • A. P. Jakobi, K. Martens: Diffusion durch internationale Organisationen: Die Bildungspolitik der OECD. In: K. Holzinger, H. Jörgens, C. Knill: Transfer, Diffusion und Konvergenz von Politiken. VS Verlag für Sozialwissenschaften, 2007.
  • France:
    • N. Mons, X. Pons: The reception and use of Pisa in France.
  • Germany:
    • E. Bulmahn [then federal secretary of education]: PISA: the consequences for Germany. OECD observer, no. 231/232, May 2002. pp. 33-34.
    • H. Ertl: Educational Standards and the Changing Discourse on Education: The Reception and Consequences of the PISA Study in Germany. Oxford Review of Education, v32 n5 p619-634 Nov 2006.
  • United Kingdom:
    • S. Grek, M. Lawn, J. Ozga: Study on the Use and Circulation of PISA in Scotland. [7]

Criticism

  • Books:
    • S. Hopmann, G. Brinek, M. Retzl (eds.): PISA zufolge PISA. PISA According to PISA. LIT-Verlag, Wien 2007, ISBN 3-8258-0946-3 (partly in German, partly in English)
    • T. Jahnke, W. Meyerhöfer (eds.): PISA & Co – Kritik eines Programms. Franzbecker, Hildesheim 2007 (2nd edn.), ISBN 978-3-88120-464-4 (in German)
