The Influence of PISA on Educational Policy in Canada: Take a Deep Breath

Pierre Brochu

The Programme for International Student Assessment (PISA) is the most widely recognized international assessment of its kind. Designed to test student knowledge and skills in the core subject areas of reading, mathematics, and science, PISA provides policy-oriented indicators of the basic competencies of youth before the end of compulsory schooling. In just over 10 years, PISA has become the ultimate reference for international, large-scale assessment. Its influence in government, in the education community, in the media, and with the general public has grown exponentially since 2000. In addition, by 2015, each of the major domains will have been tested twice, thus enabling participating countries not only to compare themselves with other education systems but also to compare their own results over time, thanks to the nine-year cycle (Table 1).

Table 1: PISA Assessment Cycle

Cycle   Major Domain   Minor Domains           Supplemental Domain
2000    Reading        Mathematics, Science    (none)
2003    Mathematics    Reading, Science        Problem Solving
2006    Science        Reading, Mathematics    Computer-Based Science
2009    Reading        Mathematics, Science    Computer-Based Reading
2012    Mathematics    Reading, Science        Computer-Based Problem Solving
2015¹   Science        Reading, Mathematics    Collaborative Problem Solving
2018²   Reading        Mathematics, Science    To be determined

1 See http://www.oecd.org/pisa/pisaproducts/pisa2015draftframeworks.htm
2 See http://www.oecd.org/pisa/pisaproducts/PISA-2018-documents-for-bidders.htm

The objective of this paper is to demonstrate how PISA results can be used in Canada — where multiple ministries/departments are responsible for education — by drawing on the experiences of selected countries, most of which have a similar federal structure. It presents case studies of how several countries have reacted and responded to PISA results over the last decade to plan education reform and improve education. Firstly, I will briefly describe the particularities of PISA administration in Canada and summarize the Canadian results since 2000. Secondly, I will provide several examples of how other countries with either a similar political structure or similar challenges have used their PISA results to implement the changes necessary to respond to these results. Thirdly, I will make some observations on the growing use of trend data related to PISA and how Canadian provinces have responded to the PISA results thus far. Finally, I will argue that, among the lessons learned from PISA, Canadian education systems can derive the greatest benefit from the experiences in their own country as well as from those in other countries by considering a long-term perspective in order to orient educational policy in a constructive and responsible manner.

PISA in Canada

Canada has participated in PISA since its inception in 2000. Approximately 20,000 students from 1,000 schools in the 10 provinces are typically evaluated (Brochu et al., 2013). This sample size is significantly larger than that of most other countries, owing to the jurisdictional and linguistic make-up of Canada's education systems. Education is the exclusive responsibility of the provinces and territories in Canada and is delivered in both English and French. A large sample is thus required in order to provide statistically reliable results for every education system in the country, as well as for both language groups.
Results of the initial PISA survey in 2000 were, for many countries, surprising. As will be explained later, some countries, such as Finland, South Korea, Japan, and Canada, performed well above the OECD average, which was duly noted by the media. Others, however, such as Germany, the United Kingdom, and the United States, fell far short of where they expected to be in the world's education league tables. In Canada, the overall results of PISA have generally been reassuring, as its average scores have regularly been well above the OECD averages (Table 2); in fact, only Finnish students have consistently outperformed Canadian 15-year-olds since 2000, although Asian economies have taken over the lead most recently (OECD, 2013a).

Table 2: PISA 2000 to 2012 - Average Score (standard error), Canada and the OECD

              2000            2003            2006            2009            2012
              OECD   Canada   OECD   Canada   OECD   Canada   OECD   Canada   OECD   Canada
Reading       500    534      494    528      492    527      493    524      496    523
              (0.6)  (1.6)    (0.6)  (5.6)    (0.6)  (5.5)    (0.5)  (5.2)    (0.5)  (6.2)
Mathematics   -      -        500    532      498    527      496    527      494    518
              -      -        (0.6)  (1.8)    (0.5)  (2.4)    (0.5)  (2.6)    (0.5)  (2.7)
Science       -      -        -      -        500    534      501    529      501    525
              -      -        -      -        (0.5)  (2.0)    (0.5)  (3.0)    (0.5)  (4.0)

Canada also stands out not only for attaining high results but also for the considerable equity in achievement (OECD, 2011b), as shown in Table 3. The country has been cited as a model for permitting students to reach their full potential as constructive and reflective citizens regardless of the school they attend (OECD, 2013b), as demonstrated by the many measures of equity used by PISA: a low proportion of low achievers; a relatively small achievement gap between high and low achievers; a small proportion of variance explained by between-school differences; a weak relationship between performance and socioeconomic status; and a small gap between students from an immigrant background and those born in the country (Levin, 2012).

Table 3: PISA 2012 Mathematics - Selected Measures of Equity, Canada and the OECD

                                                                      Canada        OECD
Proportion of 15-year-olds below level 2                              14%           23%
Gap between 90th and 10th percentiles                                 231 points    239 points
Proportion of total variation explained by between-school variance   18.4%         36.8%
Percentage of variance explained by socioeconomic status              9.4%          14.6%
Gap between non-immigrant and first-generation immigrant students    -6 points     +45 points

These results are particularly interesting given that Canada is the only OECD country without a central (federal) ministry/department of education, since centralization can, by definition, facilitate the creation of equitable education policies and help to ensure equitable resource allocation. Other countries with a federal presence in education, such as the United States or Germany (discussed below), have generally achieved average performance on PISA since 2000 with far less equity than Canada. However, as argued by Wallner (2013), the high degree of equity in Canada may well be a consequence of decentralization, as the Canadian systems allow provinces and territories to adapt their policies, curricula, and resource allocation to the specific needs of their populations. That being said, there is a measure of equity for federal countries that has not drawn a lot of attention internationally but does warrant a closer look for Canada. In 2000, the gap in reading between the lowest and highest achieving provinces was 49 points.
In 2012, the gap was 45 points, suggesting slightly more equity. However, this equity came at the cost of achievement, in that the highest-achieving provinces scored 15 points lower than in 2000 and the lowest-achieving province 6 points lower.

PISA Results in Other Countries

Other federal countries participating in PISA have generally shown results much lower than Canada's. Germany, the United Kingdom, the United States, and Spain have all seen their results close to the OECD average, while Australia and New Zealand have been slightly above. Since the mid-1950s, Germany has stood out as a world leader in higher education and as one of a handful of countries where compulsory education has been well established for the past half century (UNESCO, 2000). However, the initial PISA 2000 results created what has since been referred to as "PISA shock." The OECD PISA 2000 ranking had a huge impact in the country, to the point where it "stopped the complacency and self-confidence with which Germany had looked at its education system for too long" (Der Spiegel, as cited in Drager, 2012, p. 5). Facing results that placed the country below the OECD average, both orders of government (federal and Länder) proposed urgent reforms, which focused on outputs and Germany's international competitiveness (Martens and Niemann, 2010) and laid great emphasis on empirical research and pedagogic practice (Ertl, 2006). They included a significant increase in student testing, changes to curricula, increases in funding, and additional measures of quality control (Grek, 2009; Anderson, Chiu and Yore, 2010; Neumann, Fisher and Kauertz, 2010). Interestingly, the most recent PISA results (2012) have confirmed a significant improvement in Germany's PISA average scores and greater equity in education outcomes (OECD, 2013c). It is worth noting, however, that the streaming of students (a notable feature of the German education system that has been strongly criticized in some quarters) remains untouched.

The earlier PISA results also triggered strong reactions in the United Kingdom. While the UK's participating entities (England, Scotland, and Northern Ireland) each registered areas of positive outcomes, the results were less encouraging in aggregate. Since the first round of PISA, the UK's performance has been portrayed as "at best stagnant, at worst declining" (Chakrabarti, 2013; Coughlan, 2013), with teacher qualifications and school autonomy being given, among other factors, as possible reasons for the lower achievement; nonetheless, this did not lead to concrete policy change (Baird et al., 2011). However, it has also been argued that PISA was a catalyst for an increase in testing, with an explicit reference to PISA-related performance targets in Ireland (Breakspear, 2012; Figazzolo, 2009).

In the United States, however, the "very average" results from the early rounds of PISA were largely ignored by the American education community, policy makers, and the media. This may have been due, in part, to the fact that the PISA sample was relatively small: like Canada, the United States maintains a decentralized education system (albeit with a significant federal presence), and the PISA samples did not yield results that could be analysed at a state level. While little has actually been done to reform education in America based on PISA findings, more attention is being paid to them, as seen in the positions taken by the U.S.
Secretary of Education (Duncan, 2013) and expressed at the recent International Summit on the Teaching Profession. PISA-related discourse in the U.S. has focused not on "spending more" but on "spending more wisely," in recognition of the fact that the United States is second only to Luxembourg in per-student spending on education (Paine and Schleicher, 2011). Specifically, extensive comparative analyses of PISA results in the United States and in high-performing countries have argued that resources need to be redirected to socioeconomically disadvantaged schools (Merry, 2013), teacher salaries (OECD, 2011a), and programs that increase teacher effectiveness (Hanushek, in Froese-Germain, 2010, p. 18). In Spain, results have been characterized by lower achievement and lower equity, both between regions and between sub-populations, with no tangible improvement over time (OECD, 2013d).

Identifying the factors that drive PISA results in high-performing countries is difficult (OECD, 2011a). Education systems are highly complex, and virtually any combination of their elements can be cited in explaining PISA results (whether strong or weak). As explained by Figazzolo (2009), "Taken as they are, which is (...) very often what happens, PISA results can be used to support A as well as the opposite of A" (p. 28). Thus, it is advisable not to look at systems or factors in isolation, but rather to consider how a combination of factors works to produce high performance — and whether this combination of factors can be replicated in other similar contexts. This has clearly been how the OECD has elected to portray individual country results and how many countries have used PISA results to further their political agendas. The pressure created by PISA to learn from the best (i.e., the highest-performing countries based on PISA) has triggered the emergence of a new phenomenon labelled "educational tourism" (Robelen, 2013), where high-performing countries are visited by delegations from lower-performing countries.

A case in point is Finland. Its leading performance in PISA since the initial round has generated an exceptional amount of interest and has been attributed to a variety of factors: non-differentiation (i.e., no tracking or streaming of students); highly qualified and respected teachers; the absence of high-stakes national assessments; and a decision-making process for curriculum and teaching approaches that is decentralized and school-based (Valijarvi et al., 2002; Malaty, 2012). Obviously, other countries have these factors in place to some extent, but, as with any good recipe, the Finnish secret lies in having the right ingredients, in the right amounts, and in the right context. Not surprisingly, many education stakeholders from around the world wished to emulate Finland's results after the first round of PISA. However, they tended to focus on those factors that furthered specific political, educational, or economic agendas. Teacher unions, for example, cited the absence of a testing regime or the presence of highly educated teachers in Finland (OSSTF, 2007; Figazzolo, 2009). Other stakeholders pointed to different factors, such as the absence of streaming (as compared to Germany, among others) (Ammermüller, 2005); a homogeneous population (Entorf and Minoiu, 2005); the flexible curriculum and school structure (OECD, 2011a); and the late entry point for compulsory schooling (Mead, 2008).
The Finnish model contrasts with another successful system, namely that of South Korea, which has been used to justify very different policies (Pearson, 2012). These include long study hours (Chakrabarti, 2013); private investment in education (Lloyd, 2012); a combination of high expectations and a curriculum that emphasizes creativity and problem solving (Marginson, 2013); and (unlike Finland) a strong culture of testing (Dalporto, 2013). There are also countries where results have fluctuated over time. This is the case in Japan, whose stellar results in the initial round of PISA were followed by a decrease in reading in 2003 and another decrease in mathematics in 2006. Japanese policy makers responded swiftly to these declines by initiating a multi-year plan to improve reading, followed by the implementation of national tests and a national curriculum review (Ninomiya and Urabe, 2011).

Trends: The New Yardstick in Education Effectiveness

During the first few cycles of PISA, country ranking was the most commonly reported result in the media (as opposed to average score or the proportion of students at certain performance levels). In 2009, however, the emphasis started to shift, with greater interest being given to changes in a country's results over time. This new focus was in part a response to the changes in the countries participating in PISA from one cycle to the next: with new (often high-performing) countries and economies joining in later cycles, it became increasingly difficult for established countries to make sense of their "ranking" over time. To assist in better evaluating changes within a country, the OECD developed an index of annualized change in performance that takes into account the number of years between each measure. This was provided in the 2012 international report (OECD, 2013a) and gave participating countries a robust indicator of internal progress (or decline) over time. According to this index (see Table 4), about half of all countries have improved over time in reading since 2000; a number of countries have seen a decrease in mathematics performance since 2003; and a majority of countries have remained stable in science since 2006.

Table 4: Number of countries and economies and change in average score over time (PISA 2000-2012)

Reference year (2012)   Reading 2012           Mathematics 2012       Science 2012
                        compared to 2000       compared to 2003       compared to 2006
+                       32                     25                     19
=                       22                     25                     37
-                       10                     14                     8

Note (author's calculations):
+ Number of countries and economies where the average score has increased over time
= Number of countries and economies where the average score has not changed significantly over time
- Number of countries and economies where the average score has decreased over time

As a result of the country's high performance since 2000, reactions to PISA results have been more muted in Canada than in many other countries. This has extended to the policy area, where policy makers have steered clear of making drastic changes based on limited data or research (Hess, 2008). However, provinces that did not fare as well as expected have introduced some moderate initiatives, based in part on PISA data, which could be characterized as fine-tuning an already strong system.
New Brunswick reconsidered its French-immersion program (Dicks, 2008; Cooke, 2009), and Ontario launched both its Literacy and Numeracy Initiative (to improve reading and mathematics results at the primary level) and its Student Success Initiative (to increase the high school graduation rate) (OECD, 2011a). A recent decline in national results, however, may stimulate a stronger response. Described in some quarters as "a national emergency" (Globe and Mail, 2013), the weakening of mathematics skills evident in PISA 2012 has prompted calls for immediate action in two areas: the training of teachers in mathematics and a review of the content of provincial mathematics curricula.

Teacher training was, interestingly enough, often cited as a reason for Canada's strong results in early PISA cycles (OECD, 2004). More recently, however, the international Teacher Education and Development Study in Mathematics (TEDS-M), administered in 2008 (CMEC, 2010), pointed out that many Canadian elementary-level teachers lacked knowledge in mathematics and mathematics pedagogy, while those at the lower-secondary level lacked training in assessment. The study also noted that a smaller proportion of mathematics educators (those teaching future teachers) in Canadian universities were specialized in mathematics at the doctoral level compared to the international average. Provincial mathematics curricula came under scrutiny from a number of observers for their emphasis on "new math." This approach lies at the heart of mathematics curricula in most of Canada (with Quebec, whose mathematics scores greatly outranked the rest of the country, a notable exception) and was singled out for favouring discovery learning and problem solving over "basic knowledge and skills" and "daily-use math" (Alphonso, 2013).

Lessons Learned

At the time of writing this article, several provinces are considering proposals in the two areas of teacher training and curriculum renewal (Alberta Education, 2014; British Columbia Education, 2013; Manitoba Education, 2013; Nova Scotia Education and Early Childhood Development, 2013; Ontario Ministry of Education, 2014), although many initiatives had already been undertaken prior to the release of the PISA 2012 results. Canadian provinces would be well advised to reflect carefully on any reforms they may undertake. Judging by the situation in other countries, there appears to be an inclination to push the panic button and implement reforms based on limited evidence. Too often, causation and correlation are confused when discussing PISA results (Mortimore, 2009), and any outcomes should be validated with other data sources, such as other international studies and pan-Canadian or provincial results. Furthermore, the growing use of trend data, rather than reliance on comparative rankings alone, can significantly improve the usefulness of PISA results, in particular in countries such as Canada, the United States, Germany, Italy, or Spain, where PISA results are available at the regional, state, or provincial/territorial level. The ultimate goal of education should not be to finish first in the PISA race or to improve in international rankings. Instead, education should enhance performance levels and equity for all students (Yore et al., 2010).
PISA indicators should be used to attain these goals by being integrated into national/federal policies and practices (Breakspear, 2012), not by replacing them. Reform of Canada's education systems should acknowledge that, from an international perspective, the country is still regarded as a model to emulate. It would make little sense to implement major changes to education policies across Canada based solely on PISA results when other countries use the same results to justify emulating Canada. Canada also benefits from a very large sample size in PISA, which allows results to be analysed at a fine-grained level. Canada's provinces can thus learn not only from the experience of other countries but also from their neighbouring provinces (Wallner, 2013). In the case of the most recent mathematics results, for example, Quebec appears to have much to impart to the rest of the country, as its results place it among the highest of all PISA participants in the world. Not only is it one of those jurisdictions that elected not to completely redesign its mathematics curriculum to integrate discovery learning into the program of study (Alphonso, 2013), it is also the province with the most mathematics teachers who are actually specialized in teaching mathematics (CMEC, 2012) and the only province where the proportion of low achievers in mathematics has not increased over the past nine years (Brochu et al., 2013). As another example, students in British Columbia have achieved sustained high performance in reading, science, and problem solving in recent PISA cycles, and that province's ongoing curriculum review is cited as one of the reasons for their success (Sherlock, 2014).

This paper has attempted to analyse the impact of PISA in several countries where PISA results have garnered considerable attention over the past decade. It argues that silver bullets based on PISA results are not only unrealistic but should be avoided (Hargreaves and Shirley, 2012). Education systems are complex entities requiring a thoughtful, systematic, and balanced approach to reform (Sahlberg, 2011).

References

Alberta Education (2014) Curriculum redesign - Letter from the Minister. Available from: http://education.alberta.ca/department/ipr/curriculum/ministercrletter.aspx (Accessed 14th March 2014).

Alphonso, C. (2013, 3rd December) Canada's falling math rankings renews push for national standards. Globe and Mail. Available from: http://www.theglobeandmail.com/news/national/education/canadas-falling-math-rankings-renews-push-for-national-standards/article15755434/ (Accessed 3rd December 2013).

American Psychological Association (2010) Today's superheroes send wrong image to boys, say researchers [Press release]. Available from: http://www.apa.org/news/press/releases/2010/08/macho-stereotype-unhealthy.aspx (Accessed 11th December 2014).

Ammermüller, A. (2005) PISA: What makes the difference? Explaining the gap in PISA test scores between Finland and Germany. Discussion Paper No. 04-04. Centre for European Economic Research. Available from: ftp://ftp.zew.de/pub/zew-docs/dp/dp0404.pdf (Accessed 11th December 2014).

Anderson, J.O., Chiu, M.-H., and Yore, L.D. (2010) First cycle of PISA (2000-2006) - International perspectives on successes and challenges: Research and policy directions. International Journal of Science and Mathematics Education. 8, pp. 373-388.

Anonymous (2007, December 21) PISA 2006 - The politics of testing. Update. 35 (4).
Toronto: Ontario Secondary School Teachers' Federation.

Baird, J.-A., Isaacs, T., Johnson, S., Stobart, G., Yu, G., Sprague, T., and Daugherty, R. (2011) Policy effects of PISA. Oxford University Centre for Educational Assessment. Available from: http://oucea.education.ox.ac.uk/wordpress/wp-content/uploads/2011/10/Policy-Effects-of-PISA-OUCEA.pdf (Accessed 11th December 2014).

Breakspear, S. (2012) The policy impact of PISA: An exploration of the normative effects of international benchmarking in school system performance. OECD Education Working Papers, No. 71. Paris: OECD Publishing.

British Columbia Education (2013) B.C. students among best internationally [Press release]. Available from: http://www.newsroom.gov.bc.ca/2013/12/bc-students-among-best-internationally.html (Accessed 11th December 2014).

Brochu, P., Deussing, M.-A., Houme, K., and Chuy, M. (2013) Measuring up: Canadian results of the OECD PISA Study. The performance of Canada's youth in mathematics, reading and science. 2012 first results for Canadians aged 15. Council of Ministers of Education, Canada.

Chakrabarti, R. (2013, 2nd December) South Korea's schools: Long days, high results. BBC News. Available from: http://www.bbc.co.uk/news/education-25187993 (Accessed 11th December 2014).

Cooke, M. (2009) PISA 2009: It's not about the Leaning Tower. New Brunswick Teachers' Association. Available from: http://www.nbta.ca/profession/curriculum/finance14.html (Accessed 11th December 2014).

Council of Ministers of Education, Canada (2010) Teacher education and development study in mathematics 2008: Canadian report. Toronto: Author.

Council of Ministers of Education, Canada (2012) Contextual report on student achievement in mathematics. Toronto: Author.

Coughlan, S. (2013, December 3) Pisa tests: UK stagnates as Shanghai tops league table. BBC News. Available from: http://www.bbc.com/news/ (Accessed 11th December 2014).

Dalporto, D. (2013) South Korea's school success. We Are Teachers. Available from: http://www.weareteachers.com/hot-topics/special-reports/teaching-around-the-world/south-koreas-school-success (Accessed 11th December 2014).

Dicks, J. (2008) The case for early French immersion: A response to J. Douglas Willms. Second Language Research Institute of Canada: University of New Brunswick. Available from: www.unbf.ca/L2 (Accessed 11th December 2014).

Drager, J. (2012) Accountability as a driver for reform: The "PISA shock" of 2001 - A spotlight on the case of Germany. Presentation at Harvard University, July 26th, 2012. Available from: http://www.hks.harvard.edu/pepg/conferences/July_2012_Presentations/Drager_Panel%201.pptx (Accessed 11th December 2014).

Duncan, A. (2013) The threat of educational stagnation and complacency. Remarks of U.S. Secretary of Education Arne Duncan at the release of the 2012 Programme for International Student Assessment (PISA). Available from: http://www.ed.gov/news/speeches/threat-educational-stagnation-and-complacency (Accessed 11th December 2014).

Entorf, H., and Minoiu, N. (2005) What a difference immigration policy makes: A comparison of PISA scores in Europe and traditional countries of immigration. German Economic Review. 6 (3), pp. 355-376.

Ertl, H. (2006) Educational standards and the changing discourse on education: The reception and consequences of the PISA study in Germany. Oxford Review of Education. 32 (5), pp. 619-634.

Figazzolo, L.
(2009) Impact of PISA 2006 on the education policy debate. Education International. Available from: http://download.ei-ie.org/docs/IRISDocuments/Research%20Website%20Documents/2009-00036-01-E.pdf (Accessed 11th December 2014).

Froese-Germain, B. (2010) The OECD, PISA and the impacts on educational policy. Canadian Teachers' Federation. Available from: http://www.ctf-fce.ca (Accessed 11th December 2014).

Grek, S. (2009) Governing by numbers: The PISA 'effect' in Europe. Journal of Education Policy. 24 (1), pp. 23-37.

Hargreaves, A., and Shirley, D. (2012) The international quest for educational excellence: Understanding Canada's high performance. Education Canada, Fall 2012, pp. 10-13.

Hess, F.M. (2008) The politics of knowledge. The Phi Delta Kappan. 98 (5), pp. 354-356.

Levin, B. (2012) Greater equity in education. Phi Delta Kappan. 93 (7), pp. 74-76.

Lloyd, M. (2012, May 14) What can we learn from South Korea? Edutech Associates. Available from: http://edutechassociates.net/2012/05/14/what-can-we-learn-from-south-korea (Accessed 11th December 2014).

Malaty, G. (2012) PISA results and school mathematics in Finland: Strengths, weaknesses and future. Available from: http://www.aka.fi/fi/A/Suomen-Akatemia/Blogit/Markku-Leskela-PISA-ja-olympialaiset/Asia-on-tutkittu-httpmathunipait~grim21_project21_charlotte_MalatyPaperEditpdf/ (Accessed 11th December 2014).

Manitoba Education (2013, 3rd December) Province's smaller classes initiative results in 130 new teachers in schools this year [Press release]. Available from: http://news.gov.mb.ca/news/index.html?archive=&item=19797 (Accessed 11th December 2014).

Marginson, S. (2013, 16th December) Australia falls further behind in PISA test of basic education skills. The Australian, Higher Education. Available from: http://www.theaustralian.com.au/higher-education/australia-falls-further-behind-in-pisa-test-of-basic-education-skills/story-e6frgcjx-1226783857196 (Accessed 11th December 2014).

Martens, K., and Niemann, D. (2010) Governance by comparison — How ratings & rankings impact national policy-making in education. TranState Working Papers, No. 39. Available from: http://www.staatlichkeit.uni-bremen.de (Accessed 11th December 2014).

Mead, S. (2008) How Finland educates the youngest children. The Early Ed Watch Blog. Available from: http://www.newamerica.net/blog/early-ed-watch/2008/how-finland-educates-youngest-children-9029 (Accessed 11th December 2014).

Merry, J.J. (2013) Tracing the U.S. deficit in PISA reading skills to early childhood: Evidence from the United States and Canada. Sociology of Education. 86 (3), pp. 234-252.

Mortimore, P. (2009) Alternative models for analysing and representing countries' performance in PISA. Education International Research Institute. Brussels.

Ninomiya, A., and Urabe, M. (2011) Impact of PISA on education policy - The case of Japan. Pacific-Asian Education. 23 (1), pp. 23-30.

Nova Scotia Education and Early Childhood Development (2013) Nova Scotia students perform well in international assessments [Press release]. Available from: http://novascotia.ca/news/release/?id=20131203001 (Accessed 11th December 2014).

Neumann, K., Fisher, H.E., and Kauertz, A. (2010) From PISA to educational standards: The impact of large-scale assessments on science education. International Journal of Science and Mathematics Education. 8, pp. 545-563.

OECD (2004) What makes school systems perform? Seeing school systems through the prism of PISA. Paris: OECD Publishing.
OECD (2010) PISA 2009 results: What students know and can do. Student performance in reading, mathematics and science (Volume I). Paris: OECD Publishing.

OECD (2011a) Strong performers and successful reformers in education: Lessons from PISA for the United States. Paris: OECD Publishing.

OECD (2011b) Education: Bridging the classroom divide. OECD Observer. No. 284.

OECD (2013a) PISA 2012 results: What students know and can do (Volume I): Student performance in mathematics, reading and science. Paris: OECD Publishing.

OECD (2013b) PISA 2012 results: Excellence through equity: Giving every student the chance to succeed (Volume II). Paris: OECD Publishing.

OECD (2013c) Programme for International Student Assessment (PISA) results from PISA 2012. Country note: Germany. Available from: http://www.oecd.org/pisa/keyfindings/PISA-2012-results-germany.pdf (Accessed 11th December 2014).

OECD (2013d) Programme for International Student Assessment (PISA) results from PISA 2012. Country note: Spain. Available from: http://www.oecd.org/pisa/keyfindings/PISA-2012-results-spain.pdf (Accessed 11th December 2014).

Ontario Ministry of Education (2014) New math supports and resources for the classroom: Ontario government committed to student success [Press release]. Available from: http://news.ontario.ca/edu/en/2014/01/new-math-supports-and-resources-for-the-classroom.html (Accessed 11th December 2014).

Paine, S.L., and Schleicher, A. (2011) What the U.S. can learn from the world's most successful education reform efforts. Policy Paper: Lessons from PISA. McGraw-Hill Research Foundation. Available from: http://hub.mspnet.org/index.cfm/22436 (Accessed 11th December 2014).

Pearson (2012) The Learning Curve: Lessons in country performance in education. 2012 Report. Available from: www.thelearningcurve.pearson.com (Accessed 11th December 2014).

Robelen, E. (2013, 18th December) Finland's global standing in education takes hit with latest PISA results. Education Week. Available from: http://blogs.edweek.org/edweek/curriculum/2013/12/educational_tourism_to_finland.html (Accessed 11th December 2014).

Sahlberg, P. (2011) PISA in Finland: An education miracle or an obstacle to change? CEPS Journal. 1 (3), pp. 119-140.

Sherlock, T. (2014, April 1) B.C. students among best in world at problem solving but 'work' remains. The Vancouver Sun. Available from: http://www.vancouversun.com/technology/students+among+best+world+problem+solving+work+remains/9692940/story.html (Accessed 11th December 2014).

UNESCO (2000) The world education report 2000. The right to education: Towards education for all throughout life. UNESCO. Available from: http://www.unesco.org/education/information/wer/ (Accessed 11th December 2014).

Valijarvi, J., Linnakyla, P., Kupari, P., Reinikainen, P., and Arffman, I. (2002) The Finnish success in PISA - and some reasons behind it: PISA 2000. Finnish Institute for Educational Research. University of Jyvaskyla.

Wallner, J. (2013, 6th December) Math scores are down. Don't panic. Ottawa Citizen. Available from: http://www.ottawacitizen.com/opinion/op-ed/Math+scores+down+panic/9261510/story.html (Accessed 11th December 2014).

Yore, L.D., Anderson, J.O., and Chiu, M. (2010) Moving PISA results into the policy arena: Perspectives on knowledge transfer for future considerations and preparations. International Journal of Science and Mathematics Education. 8, pp. 593-609.