Where is your country on the Pisa scale?
Tuesday 3 December 2013
Where is your country on the Pisa list? The Pisa (Programme for International Student Assessment) results were announced today. Pisa is carried out by the Organisation for Economic Co-operation and Development (OECD). It ranks 65 countries and economies according to how their 15-year-old students perform in mathematics, science and reading. Politicians around the world are now either congratulating themselves on their education policies or berating their schools for dropping down the league in the list, which is published every three years. The countries at the top this year include China (actually just the cities of Shanghai and Hong Kong), Singapore and South Korea. Among the countries that have fallen down the rankings are Sweden and Finland.
Our new programme for IB chemistry, which starts in 2014, is firmly rooted in the Nature of Science. A good scientist would question the scientific method behind these results: how much reliability can be placed upon them? The answer is very little indeed. In fact the method is found seriously wanting and, like the Leaning Tower of Pisa, seems to be built on very shaky foundations. A good article by William Stewart in the Times Educational Supplement, entitled “Is Pisa fundamentally flawed?”, puts the case for criticising both the underlying statistical basis for the figures and their educational validity.
I can only provide anecdotal evidence, but it repeated itself year after year. For the twenty-six years I worked at Atlantic College, we had students arriving from over 70 countries, aged 16 or 17, to spend two years at the college, where as part of the programme they took the IB Diploma before going on to university. Before that, almost all of them had gone through their own national education system. It was always very noticeable that students from some countries arrived each year knowing a huge amount about chemistry, while others knew almost nothing. In Pisa terms, some of these countries would have scored extremely highly whereas others would have been way down the list.

But what was remarkable was that the students who had the factual knowledge often could not use it to solve problems in unfamiliar areas, whereas the ‘weaker’ students often had a much better approach to problem solving. In the first term, some of the students from the ‘low on the list’ countries could become demoralised when tested on questions demanding factual recall, but by the time they took the IB two years later many of them had gone on to outdo those from the ‘high on the list’ countries.

There is so much more to education than getting students to achieve well on routine exercises. It seems to me that education systems which place the emphasis on an all-round, balanced curriculum based on critical enquiry ultimately benefit students far more than those that emphasise just memorising facts. As Richard N. Zare, a Professor in Natural Sciences at Stanford University, has said: “Students do not become brilliant scientists by being excellent at doing the same things other people do. They become brilliant scientists by being excellent in doing different things than other people - …and we will never be able to measure that in standardized tests.”