REVIJA ZA ELEMENTARNO IZOBRAŽEVANJE / JOURNAL OF ELEMENTARY EDUCATION
Vol. 12, No. 2, pp. 155-176, Junij 2019

The National Assessment of Mathematics in High Schools in Italy with Slovene as the Language of Instruction

Daniel Doz
Liceo Scientifico Statale »France Prešeren« with Slovene as the language of instruction
Korespondenčni avtor/Corresponding author: doz_daniel@yahoo.it

Povzetek/Abstract
In the following paper, we analyzed the question types in the INVALSI national mathematics assessments in Italian high schools with Slovene as the language of instruction. Through a statistical analysis, we found that closed-type questions were more frequent than open-type questions. A greater presence of closed-type questions could lead to the issues of guessing and cheating. Moreover, a greater number of closed-type questions could lead to a partial evaluation of knowledge, since procedures and other mathematical competences are not considered. In our research, we also considered the topics of the questions presented in the national assessments.

Nacionalni preizkus znanja iz matematike na višjih šolah s slovenskim učnim jezikom v Italiji
V prispevku predstavljamo tipologijo vprašanj v vsedržavnem preverjanju znanja iz matematike INVALSI. S pomočjo statistične analize lahko ugotovimo, da so vprašanja zaprtega tipa pogostejša od vprašanj odprtega tipa. Prisotnost večjega števila vprašanj zaprtega tipa lahko privede do problema ugibanja in prepisovanja. Poleg tega pa lahko večje število vprašanj zaprtega tipa privede do nepopolnega preverjanja znanja, saj se pri teh vprašanjih ne oceni postopkov in drugih matematičnih kompetenc.

Potrjeno/Accepted: 27. 2. 2019
Objavljeno/Published: 28. 6. 2019

Keywords: mathematics, Slovene education in Italy, national examinations, testing
Ključne besede: matematika, slovensko šolstvo v Italiji, nacionalni preizkus znanja, testiranje

UDK/UDC: 37.091.276:51(450)
DOI: https://doi.org/10.18690/rei.12.2155-176.2019

Besedilo / Text © 2019 Avtor(ji) / The Author(s)
To delo je objavljeno pod licenco Creative Commons CC BY Priznanje avtorstva 4.0 Mednarodna. Uporabnikom je dovoljeno tako nekomercialno kot tudi komercialno reproduciranje, distribuiranje, dajanje v najem, javna priobčitev in predelava avtorskega dela, pod pogojem, da navedejo avtorja izvirnega dela (https://creativecommons.org/licenses/by/4.0/).

University of Maribor Press

Introduction

National assessments have the purpose of evaluating learning outcomes, which are based on criteria and expectations set by various national education authorities (Benavot & Tanner, 2008). These assessments should describe the level of achievement and competences of the education system in its totality, or of a specific part of it, such as 11-year-old pupils (Kellaghan & Greaney, 2001). National assessments also play an important role in providing national policymakers with objective information about the status of the education system (Benavot & Tanner, 2008). Since they are typically subject-oriented (mathematics, language, science, etc.), and since they evaluate a particular grade level, they are an important diagnostic tool, used to identify which areas of the school system need more attention (ibid.).

Discussions regarding standardized tests have long been part of the overall concern about their ambiguity and validity (Powell & Gillespie, 1990). Despite this, the spread and growth of such tests has not diminished in recent years.
On the contrary, such tests are still very much in use, and the growing concern is that they no longer provide a true representation of a student's knowledge, but focus instead only on the techniques needed to pass the tests. In fact, as Roberts (2006) suggests, learning has become focused on the assessments themselves, on "passing tests", as it were.

Assessments are usually divided into two main categories: formative and summative (Garrison & Ehringhaus, 2010). Summative assessments are periodic tests of student knowledge. Their aim is to measure students' knowledge and acquisition of standard curricular content. They are also tools for evaluating the effectiveness of curricula and school programs (Leung, Leung & Zuo, 2014). Examples of such summative assessments, as Garrison & Ehringhaus (2010) explain, include state assessments, end-of-unit or chapter tests, semester exams and other types of tests. According to the authors, this kind of assessment is helpful for the school and the teacher, specifically when they need to make instructional adjustments and interventions. Formative assessments, on the other hand, can be used in classroom practice as a tool to understand how to adjust both teaching and learning. This kind of assessment can be formal or informal; in both cases, it is nevertheless important that the teacher give students feedback that shows the presence of a 'gap' between the level of the assessed work and the required standard (Taras, 2005).

The Italian national examination of knowledge, INVALSI, is a summative assessment. Since 2010, grade 10 students in Italian high schools are supposed to take an examination in both mathematics and the Italian language (Italian is replaced by Slovene in schools with Slovene as the language of instruction). The questions in these assessments can be either open- or closed-type. There are several reasons for preferring one type to the other: for example, closed-type questions are easier to grade; on the other hand, open-type questions evaluate students' knowledge and abilities in a more complete way. When students answer closed-type questions, there is a greater chance of guessing, since in the INVALSI examinations there are no penalties for answering a question incorrectly.

In order to establish which question type (open- or closed-type) was more frequent in the INVALSI assessment of knowledge of mathematics over the last 8 years, we decided to conduct a study based on seven INVALSI mathematics assessments for grade 10 students in Italian high schools. We were additionally interested in finding out which mathematical topic was more common in the INVALSI national assessments of mathematical competence. In particular, we wanted to understand whether there had been a shift in interest over the last three years from more theoretical topics (functions, equations, radicals, etc.) to more applied topics (statistics, data representation, modeling with linear functions, etc.). If some mathematical topics were being neglected by the national assessment of knowledge, students' knowledge of that mathematical content could become increasingly marginal, which could lead to mathematical illiteracy and make a deeper understanding of mathematical topics impossible.
In fact, teachers who base their programs on the assessment of knowledge could omit some of the more theoretical content and concentrate only on those mathematical topics that are somehow "hot" and statistically more common in the examinations.

Theoretical Framework

Which evaluation?

In education, evaluation is necessary in order to gauge what students understand and what they can do (Kartianom & Mardapi, 2017). National evaluations are done to measure the level of knowledge and competence of students, as well as to diagnose the status of the school system (Bansilal, 2017; Kartianom & Mardapi, 2017) and the national competences and achievements in specific subjects (Sulistyaningsih & Sugiman, 2016). National assessments, which are equal for the whole population, guarantee objective information about student knowledge (Cankar, 2008).

Assessment practices are rapidly transforming, since today we use more open-ended problems, hands-on problems, essays and information technology (such as computer simulations of real-world problems) (Linn, Baker & Dunbar, 1991; Stecher et al., 1998). On the other hand, some national assessments also include closed-type questions, such as multiple-choice questions, as can be seen in the INVALSI examination (Quadro di Riferimento, 2017). The INVALSI assessments in mathematics do not take into account open-ended problems or essays. There are no hands-on problems that might be evaluated just by solving the INVALSI assessment of knowledge. Hence, if we compare the structure of the INVALSI examinations with the ideas presented in Linn, Baker & Dunbar (1991) and in Stecher et al. (1998), we could conclude that the INVALSI national assessments do not evaluate students' abilities, knowledge and competences in the ways described by those authors, since they diagnose only the student's ability to solve a certain type of problem. As stated by Cankar (2008), some national examinations assess only the cognitive achievements of students, and only in specific subjects. In the case of the Italian assessment, the INVALSI examinations evaluate only student knowledge of mathematics and languages. Nevertheless, national assessments of mathematics might help teachers and educators to modify, and hence improve, their teaching (Felda, 2018). Moreover, national assessments are useful in identifying the strong and weak points of mathematics teaching, and they help to monitor developing factors in education, such as lesson programs, textbooks and teachers' learning and training (Magajna & Žakelj, 2011; Žakelj, Ivanuš Grmek & Cankar, 2012).

Parveva, De Coster & Noorani (2009) stated that national assessments can be divided into three groups:
- in the first group, we consider all the national assessments that have the goal of grading students' knowledge: the results of national assessments also have an impact on the grades of individual students. This kind of assessment also has an impact on the future choices of students, determining, for example, which school a student can attend;
- in the second group, there are national assessments that are done mainly to assess the school system and identify reforms regarding schooling;
- in the third group, we consider those assessments which are a supplement to teaching processes.
Based on the results of the examinations, educators and schools can identify their weak points and teaching needs, while seeking improvements. In this category, we could place the national assessments in Italy.

Some authors are concerned by the fact that students who are not motivated perform worse in national and international assessments than those who are motivated (O'Neil Jr., Sugrue & Baker, 2010). In this respect, Cankar (2008) states that national assessments may seem to many a waste of time, since they are "not for a grade". In Italy, the national assessment of knowledge does contribute partly to the final grade in grade 9 (INVALSI, 2010), but with the new school reform, the INVALSI examinations have only a diagnostic function and not a grading one (Studenti, 2019a, 2019b). In grades 10 and 13, the INVALSI examinations are obligatory, but they do not influence the final grading of the student (Studenti, 2019; Studentville, 2019). Considering these statements and the conclusions of Cankar (2008), we can observe that many students are demotivated while sitting the INVALSI assessment tests, which is congruent with what was stated by the Italian educator Daniele Novara (Repubblica, 2018).

Open- and closed-type questions

Stankous (2016) points out that the issue of measuring student performance has often been at the center of several debates. In order to evaluate students' knowledge, teachers can prepare tests with different typologies of questions: open (constructed-response) and closed (selected-response) questions. In open questions, students must construct their own responses, organizing and applying their knowledge (Powell & Gillespie, 1990). Preparing such questions is easy, while grading them is much more difficult, since clear criteria and scoring tables are difficult to prepare (ibid.). In these types of questions, there is less chance of students answering by guessing. In closed questions, on the other hand, students need to select the answer among various alternatives. Grading such questions is much easier and faster, but their preparation is time consuming. Guessing in closed-type questions is also an important issue (Klufa, 2015). Students who face a multiple-choice question with four options have a 25% chance of guessing the correct answer. If there is no penalty for a wrong answer, students are more likely to guess, in order to collect more points (Espinosa & Gardeazabal, 2010).

National assessments can have both open- and closed-type questions. Scoring assessments with multiple-choice questions is cheaper than scoring those with judgement-based tasks, but the gains to student learning from judgement-based tasks are greater (Wiggins, 1990). Similarly, Stankous (2016) has shown that constructed-response tests do more to encourage student learning than multiple-choice tests. The author affirms that open-type tests are more reliable and valid than closed-type tests and that student success cannot be measured by multiple-choice tests alone. In her research, the author found that many teachers want their students to recognize questions and question types, memorizing the correct answers so that they can "meet certain educational performance standards" (ibid., p. 315), and forgetting this information when the tests are over. Conversely, Roberts (2006) thinks that multiple-choice tests could be used to enhance the learning process.
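Returning to the issue of guessing, a short illustrative calculation makes the incentive concrete (this is our own sketch, under the assumption that a correct answer is worth one point; the values are not taken from the cited studies). For a multiple-choice item with k = 4 options, blind guessing gives

\[
E[\text{score}] = \tfrac{1}{4}\cdot 1 + \tfrac{3}{4}\cdot 0 = 0.25
\]

when wrong answers are not penalized, as in the INVALSI examinations, and

\[
E[\text{score}] = \tfrac{1}{4}\cdot 1 + \tfrac{3}{4}\cdot\left(-\tfrac{1}{k-1}\right) = \tfrac{1}{4} - \tfrac{3}{4}\cdot\tfrac{1}{3} = 0
\]

under the classical correction for guessing. Under the no-penalty rule, guessing therefore has a strictly positive expected payoff, which is consistent with the incentive described by Espinosa & Gardeazabal (2010).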
At the same time, students might leave multiple-choice testing having assimilated false or incorrect knowledge (Roediger III & Marsh, 2005). We would like to present an example, taken from the INVALSI examinations, of the concern voiced by Stankous (2016) about mere memorization.

Example 1: A multiple-choice question that appears in nearly identical form in the INVALSI examinations of the school years 2010-11, 2011-12 and 2014-15: an expression is given and the student must select which of four offered terms (A, B, C or D) it is equal to.

In Example 1, we note that the structure (or the "question type", Stankous, 2016) is similar, and the answers are constructed on the same basis; the numbers involved are also often repeated. Students could just "recall" the answer from previous exercises and "drills", forgetting how to arrive at the correct answer.

Topics in Mathematics Assessment

Mathematics literacy is defined by the OECD (2018, p. 51) as follows: "an individual's capacity to formulate, employ and interpret mathematics in a variety of contexts. It includes reasoning mathematically and using mathematical concepts, procedures, facts and tools to describe, explain and predict phenomena. It assists individuals to recognise the role that mathematics plays in the world and to make the well-founded judgments and decisions needed by constructive, engaged and reflective citizens".

Following this basic concept, there are four categories of mathematical content that are assessed in PISA and PISA-D. These are (OECD, 2018):
• change and relationships;
• space and shape;
• quantity;
• uncertainty and data.

The OECD document (ibid.) states that the proposed problems should be challenging and based on real situations. The categorization of the content into the four presented categories is important for the development and selection of items, but some problems might be transversal and would thus fit into more than one category: for example, "space and shape", "change and relationships" and even "quantity". The interdisciplinarity of the proposed problems is also important, in order to underline the coherence of mathematics as a discipline (ibid.).

The situation in Italy: the national INVALSI examinations

Italian schooling comprises five school levels: kindergarten, primary school (five years, from level 1 to level 5), middle school (three years, from level 6 to level 8), high school (five years, from level 9 to level 13), and university. After the third year of middle school, students are supposed to pass a state exam in order to proceed to a high school; after the fifth year of high school, students must sit the "Esame di stato" (State Exam) to obtain their diploma.

The National System for Evaluation (Sistema Nazionale di Valutazione) works inside the National Institute for the Educational Evaluation of Instruction and Training (INVALSI). The INVALSI also works under the supervision of the Ministry for Education, University and Research (Ministero dell'Istruzione, dell'Università e della Ricerca, MIUR); its aim is to investigate and periodically assess student knowledge across the whole Italian national territory, as proposed in the decree D.Lgs. n. 286/2004.
According to the MIUR directive 76/2009, INVALSI must assess the level of knowledge among primary school pupils, as well as middle school and high school students. Student knowledge is measured through standardized tests of Italian language and mathematics (Martignone, 2016; Quadro di Riferimento, 2017). As written in the Quadro di Riferimento (2017), these tests are currently applied in the second and fifth years of primary school (levels 2 and 5), the third year of middle school (level 8) and the second and fifth years of high school (levels 10 and 13). INVALSI prepares the examinations considering the curricula for primary and secondary schooling, i.e. the "Indicazioni Nazionali per il curricolo della scuola dell'infanzia e del primo ciclo di istruzione" and the "Indicazioni Nazionali e Linee guida per le scuole secondarie di secondo grado" (Martignone, 2016; Quadro di Riferimento, 2017).

The decree MIUR-MEF n. 211 of 7 October 2010 and the directive MIUR n. 57 of 15 July 2010 are the two documents that regulate the curricula in Italian high schools. These documents state that students, at the end of five years' schooling, should know the basic concepts and methods used in mathematics, not only from a theoretical point of view, but also in order to model and describe various phenomena from the real world. It is also stressed that students should be able to use logical and coherent argumentation and apply mathematical concepts in everyday life. The documents also state that, when formulating problems for high school students, it is important to show them the connections between theoretical knowledge and other sciences (economics, sociology, technology, physics, biology, etc.) or the real world. The documents invite teachers to show students how to use formal language and how to prove theorems, how to analyze data and predict the evolution of phenomena, how to use mathematical knowledge in other sciences, and how to introduce new concepts using elements from the history of mathematics, the history of science, technology and cultural development.

During the second year of high school, students at all Italian high schools are required to write a national assessment of Italian language and mathematics. However, these standardized tests:
- cannot evaluate students' metacognitive or non-cognitive achievements, such as that embodied in "the students develop a positive attitude toward mathematics" (Quadro di Riferimento, 2017);
- cannot evaluate students' ability to argue, prove or solve certain problems (those that would require more time or a greater number of steps), formulate hypotheses, or model real-world situations and analyze them from a mathematical point of view (Quadro di Riferimento, 2017);
- are objective; hence, they do not take into account the affective and conative aspects, which are also important in evaluating student work and assessing competence (Quadro di Riferimento, 2017).

Until the school year 2017-18, the INVALSI examinations for schools with Slovene as the language of instruction were printed. Every year, the printed dossier had a variable number of questions, which were divided into different items (Quadro di Riferimento, 2017).
Questions in the INVALSI examinations, as expressed in the Quadro di Riferimento (2017), can be:
- closed: multiple-choice questions with four alternatives (only one answer is correct), and true-false questions composed of several sub-questions (see Example 2);
- short open questions: these require a simple, rapid open answer, such as the result of a computation, or a graphic answer (see Example 3);
- open: these require simple argumentation or short computations;
- cloze: the student is required to complete a sentence, computation or expression.

Example 2: Example of a multiple-choice question from the INVALSI examination for the school year 2017-18: an expression is given and the student must select which of four offered results (A, B, C or D) it is equal to.

Example 3: Example of a short answer question from the INVALSI examination for the school year 2017-18: an equation with a real unknown is given, and the student must complete the statement about its solution by filling in the blank.

Students were allowed to use their calculators during the tests, but not devices that could connect to the internet, wireless networks or Bluetooth. They could also use compasses, rulers and goniometers.

In the Quadro di Riferimento (2017) it is also stated that simple language must be used in writing the problems proposed to the students; no dialectal or regional expressions are used, and the test writers try to avoid unnecessary technical jargon. Pictures are used only when particularly explicative; the data provided in the problems are mainly taken from real data and statistics. Questions are also equally distributed among the various topics.

The proposed problems have two dimensions: the cognitive dimension and the topic dimension. Topics are divided into four categories: numbers, geometry, algebra, and data analysis (and probability). The cognitive dimension is divided into three categories:
- knowing: the student understands the facts, concepts and procedures;
- applying: the student should know how to apply their knowledge and acquired concepts to solve problems and answer questions;
- reasoning: the student solves problems related to complex and unfamiliar contexts.

Empirical Research

Aim of the Research

The aim of the present research is to analyze several INVALSI examinations in mathematics in the Slovene language, in order to understand the typology of their questions and the field of mathematics to which each question is related. This research could have practical applications: teachers and students could be informed of the mathematical topics and question types that are more common on the INVALSI examinations. On the other hand, this research could show which knowledge and competence in mathematics Italy requires from its students. We wanted to investigate whether there have been changes in the types of questions and the fields of interest from 2011 to 2017 and, if so, what these differences were. Our research questions were the following:
- Is there a prevalent typology of questions on the INVALSI examinations? Are there more closed-type questions (e.g. true-false questions or multiple-choice questions) or open-type questions (e.g. short answer questions)?
- Is there any mathematical topic that appears more often in the INVALSI examinations than others? And if so, has there been any change in the topics of interest over the years?
- Have theoretical fields, such as functions, set theory and logic, gradually become less common on the INVALSI mathematics examinations?

Methodology

Research method

In the research, we used the descriptive statistical method and the non-experimental method of causal analysis. We decided to use these methods because they are best suited to answering the research questions.

Statistical sample

In the research, we considered seven INVALSI mathematics examinations for the second year of high schools with Slovene as the language of instruction, i.e. all the examinations from the school year 2010-11 to the school year 2016-17. We omitted from our sample the INVALSI examination from the school year 2017-18, because it was almost identical to the one from 2013-14. The samples cannot be found online, since they are prepared ad hoc for schools with Slovene as the language of instruction; the versions in Italian can be found on the website https://www.engheben.it/prof/materiali/invalsi/seconda_superiore_matematica.htm and are similar to those in Slovene.

Analysis of the Data

We first looked at the types of questions appearing on the various INVALSI examinations. Next, we sought to identify the mathematical topic to which each question referred, in order to understand which topic is the most popular. The collected data were analyzed using the descriptive statistical method, expressing the frequency of appearance. All data were analyzed with the help of the statistical software Jamovi.

Results and Discussion

The Typology of the Exercises

In Table 1 we present the typology of the various questions on the INVALSI examinations from the school year 2010-11 to the school year 2016-17. The table reports the number of questions (and sub-questions) of each type: multiple-choice, true-false, long answer (a procedure is evaluated and scored), fill-in the blank, short answer (only a numerical answer is required, no procedure is evaluated), multiple-choice and discussion, connect the terms, and the total number of questions on the examination. In brackets, we give the percentage of that type in that year's examination.

Table 1: Typology of exercises in various school years.

School year | Multiple-choice | True-false | Long answer | Fill-in the blank/complete | Short answer | Multiple-choice and discussion | Connect | Total
2010-11 | 23 (56.1%) | 3 (7.3%) | 3 (7.3%) | 1 (2.4%) | 11 (26.8%) | 0 (0.0%) | 0 (0.0%) | 41 (100.0%)
2011-12 | 21 (46.7%) | 4 (8.9%) | 2 (4.4%) | 4 (8.9%) | 11 (24.4%) | 3 (6.7%) | 0 (0.0%) | 45 (100.0%)
2012-13 | 28 (63.6%) | 2 (4.6%) | 0 (0.0%) | 1 (2.3%) | 11 (25.0%) | 2 (4.6%) | 0 (0.0%) | 44 (100.0%)
2013-14 | 13 (34.2%) | 6 (15.8%) | 0 (0.0%) | 2 (5.3%) | 16 (42.1%) | 1 (2.6%) | 0 (0.0%) | 38 (100.0%)
2014-15 | 18 (42.9%) | 4 (9.5%) | 0 (0.0%) | 2 (4.8%) | 16 (38.1%) | 2 (4.8%) | 0 (0.0%) | 42 (100.0%)
2015-16 | 17 (42.5%) | 3 (7.5%) | 0 (0.0%) | 3 (7.5%) | 15 (37.5%) | 2 (5.0%) | 0 (0.0%) | 40 (100.0%)
2016-17 | 14 (35.0%) | 6 (15.0%) | 2 (5.0%) | 4 (10.0%) | 13 (32.5%) | 0 (0.0%) | 1 (2.5%) | 40 (100.0%)
Total | 134 (46.2%) | 28 (9.7%) | 7 (2.4%) | 17 (5.9%) | 93 (32.1%) | 10 (3.4%) | 1 (0.3%) | 290 (100.0%)

From Table 1, we can see that the most frequent type of question on the INVALSI mathematics examinations for the second year of high school is the multiple-choice question (46.2%). The second most frequent type is the short answer question (32.1%), followed by true-false questions (9.7%).
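The frequencies and percentages in Table 1 can be reproduced with a few lines of code once each question has been hand-coded with a type label. The sketch below is our own illustration in Python (the analysis in the study itself was carried out in Jamovi); the counts correspond to the 2016-17 row of Table 1.

```python
# Illustrative sketch: compute the per-type frequencies and percentages
# reported in Table 1 from a hand-coded list of question types.
from collections import Counter

# Hypothetical coding of the 2016-17 examination (counts taken from Table 1).
question_types = (
    ["multiple-choice"] * 14 + ["true-false"] * 6 + ["long answer"] * 2 +
    ["fill-in the blank"] * 4 + ["short answer"] * 13 + ["connect"] * 1
)

counts = Counter(question_types)
total = len(question_types)
for question_type, n in counts.most_common():
    print(f"{question_type}: {n} ({100 * n / total:.1f}%)")
```

The same coding, aggregated over all seven examinations, yields the totals reported in the last row of Table 1.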
In the last three years, there has not been any significant increase in the frequency of multiple-choice questions or true-false questions, but it is still clear that the most frequent type of question is the multiple-choice question; it is also the most frequent type in each school year's examination, with the only exception being the school year 2013-14, when short answer questions were the most frequent.

In order to understand which type of question was the most frequent, we grouped multiple-choice, true-false and connect questions into the category "closed-type questions", and long answer, short answer, fill-in the blank, and "multiple-choice and discussion" questions into the category "open-type questions". These results are shown in Table 2. From the analysis, we see that closed-type questions (56.2%) are more frequent than open-type ones (43.8%). In particular, in the school years 2010-11 and 2012-13, closed-type questions were considerably more frequent than open-type ones, whereas in the other examinations their frequency was only slightly higher. No significant increase in the frequency of closed-type questions can be seen in the last three years. Every year, the number of closed questions was greater than or equal to the number of open-type questions.

Table 2: Open- and closed-type questions on the examinations.

School year | Closed-type questions | Open-type questions | Total
2010-11 | 26 (63.4%) | 15 (36.6%) | 41 (100.0%)
2011-12 | 25 (55.6%) | 20 (44.4%) | 45 (100.0%)
2012-13 | 30 (68.2%) | 14 (31.8%) | 44 (100.0%)
2013-14 | 19 (50.0%) | 19 (50.0%) | 38 (100.0%)
2014-15 | 22 (52.4%) | 20 (47.6%) | 42 (100.0%)
2015-16 | 20 (50.0%) | 20 (50.0%) | 40 (100.0%)
2016-17 | 21 (52.5%) | 19 (47.5%) | 40 (100.0%)
Total | 163 (56.2%) | 127 (43.8%) | 290 (100.0%)

As presented, closed-type questions are easier to correct, but they have several limitations. An unavoidable limitation of multiple-choice questions is that it is possible for students to guess the correct answer (Burton, 2001). Furthermore, closed-type questions cannot completely evaluate students' knowledge and abilities.

The Mathematical Topics

We first sought to understand which mathematical topics are covered by the national INVALSI examination; we therefore analyzed the content of each question and sub-question.
From the content of these questions, we decided to distinguish among the following topics:
- statistics: sampling, data organization, data representation (tables, graphs, etc.), relative and absolute frequency, percentage frequency, means, standard deviation, variance;
- probability: definition of classical probability, events, disjoint events, dependent and independent events, computation of the probability of elementary events, a priori and a posteriori probability, binomial tree;
- geometry: principal geometrical shapes and objects, definitions, relations between geometrical objects, congruence, parallelism, perpendicularity, geometrical constructions, segments, distance in the plane, measuring with the ruler, angles (internal, external, vertically opposite, supplementary, complementary, explementary, etc.), measuring with the goniometer, translations, rotations, symmetries, similarity, the Pythagorean theorem, Euclid's theorem and equivalency, Thales' theorem and similarities, perimeters and areas of plane shapes (square, rectangle, triangle, parallelogram, circle, trapezoid, etc.);
- arithmetic: natural numbers, integers, rational and real numbers, operations between numbers, properties of the operations, representation of numbers on the number line, fractions, proportions and applications, powers and their properties, square roots and cube roots, decimal and scientific notation, rounding and place value, numerical expressions, symbolic expressions;
- approximation: approximation of real and rational numbers, operations with approximated numbers, working with big or small numbers, scientific notation;
- diagrams: line diagrams, histograms, bar diagrams, pie charts, representation of numbers, reading diagrams, interpretation of a diagram;
- solid geometry: solids (cube, parallelepiped, cone, sphere, cylinder, pyramid, prism, etc.), volume and surface area, diagonals and sections;
- analytic geometry: coordinate plane, points and coordinates of points, distance between points, areas of shapes in the coordinate plane, triangles in the coordinate plane;
- algebra: unknowns, polynomials, factorization of polynomials and binomials, equations, equalities, inequations and inequalities, squares and square roots of expressions;
- percentages: interpretation of percentages, discounts and marketing, applications of percentages;
- functions: definition of a function, plots of functions, roots of functions, modeling with functions, domain and codomain of functions, polynomial and rational functions, parabola and hyperbola;
- linear functions: definition of a linear function, plot of linear functions in the coordinate plane, lines, intersection of lines, parallel and perpendicular lines, slopes and lines through two points, equation of a line in the coordinate plane, modeling with the linear function, rate of increase;
- logic: propositions, logical connectives, negations, conjunctions, disjunctions, implications and equivalences;
- set theory: sets, elements, operations with sets (union, intersection, etc.), complementary set, power set, cardinality of a finite set.

Our findings are shown in Table 3. In the last column, we present the relative frequency percentage of each topic among all 290 questions analyzed.

Table 3: Content of INVALSI examinations (number of questions per topic in each school year from 2010-11 to 2016-17; the last column gives the relative frequency percentage over all 290 questions).
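The tabulation behind Table 3 can be illustrated with a short sketch. This is our own Python illustration with a few invented example records (the actual analysis coded all 290 questions and sub-questions and was run in Jamovi): each question is assigned a school year and a topic label, and a contingency table with the relative frequency of each topic is then computed.

```python
# Illustrative sketch of the topic tabulation behind Table 3
# (hypothetical records; the real coding covered all 290 questions).
import pandas as pd

# Each record: (school year, topic assigned to the question/sub-question).
coded_questions = [
    ("2010-11", "geometry"), ("2010-11", "statistics"), ("2010-11", "algebra"),
    ("2011-12", "linear functions"), ("2011-12", "probability"),
    ("2016-17", "arithmetic"), ("2016-17", "statistics"),
]

df = pd.DataFrame(coded_questions, columns=["school_year", "topic"])

# Number of questions per topic and school year, with row and column totals.
table = pd.crosstab(df["topic"], df["school_year"],
                    margins=True, margins_name="Total")

# Relative frequency (%) of each topic over all analyzed questions,
# i.e. the quantity reported in the last column of Table 3.
table["%"] = 100 * table["Total"] / len(df)
print(table.round(1))
```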