International education assessments have become the lifeblood of education governance in Europe and globally. However, what do we really know about how education systems are measured against one another and the effects this measuring produces? Operating as a new form of global education governance, international assessments create a powerful comparative spectacle focused on the performance and apparent ‘effectiveness’ of education systems around the world; this spectacle now includes not only the global rich but also those countries often pejoratively described as ‘developing’. Yet despite international assessments’ dominance and ever-deeper penetration into the logic and planning of education, many areas of critique and complexity remain: the ways these studies are organised and delivered; the impacts they have through decontextualising education and quantifying some aspects of it (but not others); the effects they have on what is considered worthy of teaching and knowing; and, most importantly, the interlinkages that are silently yet powerfully made by commensurating education through the application of similar policy instruments to those that measure the economy, the labour market, even health, migration and international development; the list can go on.
Much attention has so far been given to the OECD Programme for International Student Assessment (PISA). But why and how has PISA become such a powerful force in education policy-making? To use a metaphor from the medical sciences, PISA took an apparently rapidly worsening patient (according to the diagnosis of the OECD) – education in Europe – and supplied it with a life-saving, and life-changing, transplant. All the essential parts were already there: an education industry; numerous national experts and statisticians; the believers in linking education with the labour market, as well as its critics; and the indicators that the OECD had been preparing since the 1970s, along with other international studies that had prepared the field: the IEA’s Progress in International Reading Literacy Study (PIRLS) and Trends in International Mathematics and Science Study (TIMSS), and the OECD’s earlier International Adult Literacy Survey (IALS) and Adult Literacy and Life Skills Survey (ALL). In addition, from a more European point of view, a soft governing tool (with a hard agenda!), the Open Method of Coordination, was also ready to be launched and to change the European education policy landscape for good. PISA became the heart that breathed life into this previously disparate body. This heart beat to the rhythm of comparison and competition, connecting the parts into a single entity, itself represented by the OECD rating and ranking tables. The PISA charts became the totemic representations of the new governing regime, excluding caveats or any awkward knowledge in order to offer policy makers what they are often after – fast-selling policy solutions.
This is the beginning of a story that has been eloquently described and analysed by a number of academics in the field. The Laboratory of International Assessments was set up to investigate ‘chapter 2’ of this story and ask: now that international assessments are with us (and seem to be here to stay), what are their long-term effects on education governance in Europe and globally? What do they mean for the relationship between knowledge and policy, and what do they suggest about the changing politics of education policy in the 21st century? How do policy makers use them (if they do)? Can participation in their organisation and management be made more open and democratic, or does their statistical complexity render them legible only to the very few? These and many other questions are what we intend to discuss over the next couple of years in the Economic and Social Research Council (ESRC) seminar series on ‘The Potentials, Politics and Practices of International Education Assessments’. The first seminar, on ‘Education Governance and International Assessments’, will take place at the University of Edinburgh this December 11 and 12 – it is already oversubscribed, a fact which shows the increasing interest in the phenomenon among the scholarly, policy and testing agency communities. For more commentaries and focused analysis, watch this space – we are only just starting!