Supportive climates and science achievement in the Nordic countries: lessons learned from the 2015 PISA study
Large-scale Assessments in Education volume 10, Article number: 12 (2022)
Teacher-student interactions are crucial in understanding the role of a supportive climate in instructional practices. The present study investigates the perceptions of 15-year-old Nordic students regarding four aspects of their science class: teacher support, fairness, feedback, and class discipline. Multilevel modelling analysis is used to examine the extent to which a perceived supportive climate can explain variation in the Nordic students’ science achievements. Overall, the main findings based on Programme for International Student Assessment (PISA) 2015 data from Denmark, Finland, Iceland, Norway, and Sweden indicate that, at the student level, perceived feedback from teachers and perceptions of teachers as fair explain significant variation in science achievement. The study provides practical and theoretical implications regarding the importance of strong teacher-student relationships in comprehending the concept of a supportive climate.
The Programme for International Student Assessment (PISA) is an international study by the Organisation for Economic Co-operation and Development (OECD) measuring the academic performance of 15-year-old students in mathematics, science, and reading. In this paper, we examine the extent to which students’ perceptions of a supportive climate explain variations in students’ achievement in science, using data from the PISA 2015 cycle for the five Nordic countries.
Past research on educational effectiveness has emphasized the importance of instruction and teacher quality for improving educational outcomes (Hattie, 2009). Most frameworks have described three basic dimensions of instructional quality: supportive climate, cognitive activation, and classroom management (Klieme et al., 2009). Supportive climate encompasses several aspects of the teacher-student relationship, such as direct support from teachers in lessons via subject-related directions (e.g. teacher support; Klusmann et al., 2008) and emotional support provided by caring teachers (Pianta & Hamre, 2009), such as dealing with students fairly and having constructive feedback routines. An overlap between the terms teacher support, emotional support, and supportive climate therefore seems unavoidable when understanding these constructs in terms of teacher practices and instructional quality (Klieme et al., 2009).
Keeping the importance of students’ learning outcomes and motivation in mind, strong links between teacher–student relationships and outcomes have been documented in recent studies (Hattie, 2009; Krane et al., 2017; Pianta & Allen, 2008). Moreover, teacher-directed instruction has been found to be positively associated with performance (Lau & Ho, 2020). Teaching practices conducive to learning (Hattie, 2009), such as providing extra help to students when needed, listening to and respecting students’ input, and caring for and encouraging students, alongside the influence of such practices on learning outcomes, necessitate expanding the understanding of a supportive climate. In schools with a supportive climate conducive to learning, fewer disciplinary problems are observed (Cohen & Geier, 2010). Students benefit more in lessons where discipline norms are understood and implemented by agreement, and where teachers are perceived as fair regardless of students’ individual backgrounds (Krane et al., 2017).
To expand on the dimension of supportive climate, this paper utilizes information from existing theoretical frameworks and literature to operationalize supportive climates by investigating four key aspects: teacher support, fairness, feedback, and class discipline (Fauth et al., 2014, 2019; Jimerson & Haddock, 2015; Klieme et al., 2001; Kunter et al., 2013; Taut & Rakoczy, 2016). The combination of reports on student background variables, student attitudes, and school features facilitates a detailed examination of the multiple aspects of a supportive class climate, as well as an investigation of links between student reports on supportive climate and educational outcomes. Moreover, examining the association between student achievement and supportive climates may assist in identifying factors that could improve educational outcomes (Baumert et al., 2010; Fauth et al., 2019; Krane et al., 2017). This is critical because, unlike student background characteristics and peer impact, educators can influence students’ perceptions of a supportive climate through consistent and conscious effort (Lanahan et al., 2005; Wagner et al., 2016).
This study examines variations in science achievement at the student and school levels using data from five Nordic countries—Denmark, Finland, Iceland, Norway, and Sweden, from the PISA 2015 study—in a multilevel modelling (MLM) analysis. We attempt to understand the characteristics of supportive climate in science classes and extend our analysis to assess whether the variation in science achievement explained by supportive climate is similar across the five Nordic educational systems. We have chosen these countries due to their similarities in social, cultural, political, and economic factors. Although the five have independent school systems and their own curricula, the goal of fair education systems where “equity, participation and welfare are viewed as major national goals” is comparable (Antikainen, 2006, p. 229). Despite an underlying ambition for all students to have equal access to education irrespective of their gender, origin, socioeconomic status, or cultural background, differences are noticeable concerning the relation between student background and achievement across the Nordic countries (OECD, 2016a, 2016b). Even with major similarities in compulsory education in the Nordic countries, differences among school systems (OECD, 2016a, 2016b) and cultural and linguistic diversity in most classes in the Nordic schools have been observed (Björnsson, 2020). This makes it relevant, from an equity viewpoint, to investigate the role of supportive climates in enhancing educational outcomes.
Conceptualization of supportive climates
In the educational context, there is a consensus among researchers that teachers play an important role in student learning and that instructional quality is a key determinant of educational outcomes (Bellens et al., 2019; Nilsen & Gustafsson, 2016; Praetorius & Charalambous, 2018). Prior studies identified three key dimensions describing instructional quality: teacher support, cognitive activation and classroom management (Klieme et al., 2009; Kunter et al., 2013). Literature reviews on instructional quality have captured different measures and definitions of supportive climates from the perspective of teacher–student relationships, such as feedback on assessment (Vieluf, 2013), interactions in classrooms (Danielson, 2007), and emotional guidance from teachers (Pianta & Hamre, 2009), besides support from teachers (Seidel & Shavelson, 2007).
Newer understandings of instructional quality have been characterised by conceptually overlapping domains. In addition to instructional clarity, cognitive activation and discourse, Klette (2015) identified a supportive climate as one of the key features of classroom teaching and learning. In this context a supportive climate captures both (a) interpersonal dynamics characterized by mutual respect and perceptions of fairness and (b) good classroom management procedures.
Although the conceptualizations may vary across frameworks, the core aspect of teacher support overlaps with interpersonal dynamics characterized by mutual respect. Findings regarding a supportive climate also revolve around teachers’ mindful efforts to be fair and impartial within safe learning environments (Klieme et al., 2009) and students’ need to experience respect and support from teachers (Baumert et al., 2010; Praetorius et al., 2014).
Classroom management refers to managing student behaviour in class (Pianta & Hamre, 2007; Van Tartwijk & Hammerness, 2011). This includes teacher actions that help to incorporate rules and methods in organizational and scaffolding strategies in teaching (Klusmann et al., 2008). However, merely maintaining rules and regulations in class is insufficient, as teachers also need to adopt strategies to reduce interpersonal conflicts (Kunter et al., 2013). As pointed out by Ma and Willms (2004), fewer disciplinary problems and more teacher support in science lessons are key requirements for a classroom climate that is conducive to learning, which can in turn improve student achievement (Howes et al., 2011).
Supportive climates and educational outcomes
In the school and learning contexts, the supportive climate provided by teachers in the form of support, recognition and facilitation of knowledge development comes across as a vital element due to its effects on both cognitive and non-cognitive outcomes (Wang et al., 2020). Empirical studies on instructional quality examine supportive climates considering explicit aspects of teacher-student interactions and their connection with student motivation and academic achievement (Burić & Kim, 2020; Scherer & Nilsen, 2016; Scherer et al., 2016).
Teacher support and its association with student achievement
Many studies have identified teacher support as an integral part of the teacher-student relationship. The teacher support construct covers providing students with both academic support (e.g. in the form of encouragement and facilitating the process of learning) and emotional support (e.g. in the form of involvement, acceptance and trust) (Pitzer & Skinner, 2017; Wentzel et al., 2018). Empirical studies show a significant and positive association between teacher support and student achievement (Yildirim, 2012; Wong et al., 2018). Jimerson and Haddock (2015) found that teacher support facilitates students’ positive academic and social-emotional outcomes, such as promoting student engagement (Lipowsky et al., 2009); conversely, students who perceived their teachers as unsupportive were more likely to appear disengaged in class activities (Klem & Connell, 2004).
Moreover, teachers supporting their students in solving difficulties in both instructional activities (e.g. instrumental support) and outside the classroom (e.g. emotional support) boost students’ achievement motivation (Chen & Guo, 2016; Klieme et al., 2009; Pitzer & Skinner, 2017). Teacher support significantly correlates with students’ development of subject-specific interests (Fauth et al., 2014), resulting in an additive “effect” that positively contributes to students’ achievement and learning motivation, intrinsic beliefs, and increased sense of well-being (Burić & Kim, 2020; Dietrich et al., 2015; Praetorius et al., 2018; Scherer & Nilsen, 2016).
Teacher fairness and its association with student achievement
Interaction with a teacher who is perceived as fair builds a positive relationship between student and teacher (Colquitt, 2001). Teacher fairness has also been associated with positive outcomes in compliance with class rules (Colquitt, 2001), general well-being, and security (Hattie, 2013; OECD, 2017a, 2019). Conversely, unfair treatment by teachers reinforces perceived bias, and students experiencing this show lower levels of performance (Burns et al., 2020; Deal & Peterson, 2016). Perceptions of teacher fairness as an aspect of a supportive climate are also important for understanding the effects of students’ immigrant status. Teachers who come across as respecting students regardless of their cultural, ethnic, or racial background have positive influences on student well-being and on motivational and educational outcomes (Krane et al., 2017). Thus, students are more likely to follow the rules and the teacher's advice if they experience their teacher as fair and just (Colquitt, 2001).
Feedback from teachers and its association with student achievement
Timely feedback on assignments and supervision of schoolwork also contribute to developing a strong supportive climate, improving students’ self-efficacy, and raising student competence and skill levels, thereby influencing both motivation and achievement (Burić & Kim, 2020; Hattie, 2009). Regarding perceived feedback from science teachers at the student level, significant associations with science achievement indicate that low-performing students perceive receiving more feedback and score significantly higher on the perceived feedback scale than high-performing students (Sortkær, 2018). Moreover, timely feedback can provide students with descriptions of which tasks they can do and which they need to attend to more carefully in the context of school lessons (Lipko-Speed et al., 2014). Specifically, the perception of receiving feedback is vital for students’ construction of knowledge in science class (Fauth et al., 2014), supporting prior research showing that feedback from teachers is important for students’ learning and enhanced attainment (Klieme et al., 2009; OECD, 2017a). Hattie and Timperley (2007) emphasized that feedback from teachers can be both positive (supportive and ongoing) and negative (concentrating on the performance gap) and has differential effects, depending on the way in which the teacher provides it. Research findings on the effects of feedback on student performance are accordingly not uniform, as feedback from teachers is not always positive. Feedback focusing on the negative aspects of schoolwork tends to lower both confidence to achieve and self-esteem (Weaver, 2006), which can result in poorer academic performance. Moreover, students struggling to keep up with others are more dependent on teacher feedback and on the solid support structures that help in the construction of knowledge and skills (Hattie & Timperley, 2007).
Sortkær (2018), in his study based on PISA results, also demonstrated that the students who have the greatest need for feedback are those who perform poorly, including non-native students and students of low socioeconomic status.
Disciplinary climate in class and its association with student achievement
The disciplinary climate is yet another noteworthy variable that influences students’ academic performance, and a positive disciplinary climate is a facilitator for student learning (Berkowitz et al., 2017; Klieme & Kuger, 2014; OECD, 2017a; Yetişir & Kaan, 2021). Often examined as a dimension of instructional quality, classroom discipline is typically significantly and positively related to academic learning (Atlay et al., 2019; Bellens et al., 2019; Ning et al., 2015; Scherer et al., 2016; Sortkær & Reimer, 2018). Nevertheless, weak or non-significant relationships between disciplinary climate and achievement have also been found (Sortkær & Reimer, 2016). Researchers also continue to examine the association between this construct and student achievement in relation to other student variables, such as gender and socioeconomic status, as well as the role of disciplinary climate as a moderator, or mediator, in its relationship with student achievement. Ning et al. (2015) suggested that a good disciplinary climate moderates the SES–achievement relationship, whereas other researchers (Liu et al., 2015) have provided evidence that it mediates the relationship between SES and achievement. Furthermore, Sortkær and Reimer (2016) suggested that disciplinary climate plays a moderating role in the association between inquiry effectiveness and student academic outcomes.
Furthermore, reports from international large-scale studies, such as the Trends in International Mathematics and Science Study (TIMSS) and Progress in International Reading Literacy Study (PIRLS), characterize behavioural problems, such as an adverse school climate, as detrimental to students’ focus on learning (Bellens et al., 2019; Nilsen & Gustafsson, 2014). Good disciplinary climate contributes to building teacher–student relationships (Tosto et al., 2016) and enables students to internalize teacher feedback (Ning et al., 2015). Overall, a positive disciplinary climate influences student engagement, student attitudes, students’ well-being, and their educational outcomes (Burić & Kim, 2020; Hattie, 2009; Praetorius et al., 2018).
Student backgrounds and school context
Based on theoretical and empirical considerations, we identified certain student background and school variables to include in this study. The student background characteristics are gender, ethnicity, socioeconomic status, and educational aspirations, which correlate with academic outcomes (OECD, 2016a, 2016b; Martin et al., 2016). Bijou and Liouaeddine (2018) report that boys performed better in science in 72% of the systems participating in PISA 2015. Gender differences in favour of boys have earlier been noted for science achievement in a sample of 15-year-old students (Sun et al., 2012), and such differences influence the selection of STEM (Science, Technology, Engineering, and Mathematics) subjects in higher studies (Fredricks et al., 2018). Further, Liou et al. (2020) highlighted the role of both gender and grade-level differences in the motivational beliefs held by students about learning sciences and associated achievement in these fields. Minority students tend to report less favourable attitudes towards academics (OECD, 2016a, 2016b, 2019; Way et al., 2007), such that ethnicity is also an important factor in the observed differences in academic outcomes (Farkas, 2017). Family background, particularly socioeconomic status, plays an important role in predicting a student's academic performance (Chiu & Klassen, 2010; Hansen & Gustafsson, 2016; Mullis et al., 2020; OECD, 2013, 2016a, 2016b, 2019; Sirin, 2005). The link between socioeconomic status and academic outcomes noted earlier is becoming more marked in most countries, including the Nordic countries (Hansen & Gustafsson, 2016; Harwell et al., 2017). Students’ prior achievements, along with family and peers, are a source of inspiration in forming and stimulating students’ perceptions about future studies (Chow et al., 2012; OECD, 2017a; Broeck et al., 2020).
As part of the organizational dimension of school climate, school characteristics help shape common values and beliefs regarding instruction, sense of belonging, and general well-being as components of group-level perceptions (Bronfenbrenner, 1992). The quality of school facilities acts as a mediator that affects student achievement through the school climate (Uline & Tschannen-Moran, 2008; Welsh et al., 2000). A school’s physical and structural features, such as school size and school type (e.g. public vs. private), also impact student achievement and school climate (Rudasill et al., 2018). In their study, Welsh et al. (2000) connected school size to negative student behaviour, emphasizing that large school size relates to disorder and negatively influences student–teacher relations. Instruction disruption due to negative conduct and safety concerns also causes disorder in schools, affecting academic outcomes (Ma & Willms, 2004; OECD, 2016a, 2016b).
Nordic schools and equity perspective
The role of instructional quality in bridging the achievement gap between students from different backgrounds is a topic of great concern for researchers leveraging instructional quality as a tool to mitigate differences arising from individual factors (Kyriakides & Creemers, 2011). Specifically, when it comes to studying the contribution of socioeconomic status (SES; henceforth, ESCS, denoting the PISA index of economic, social, and cultural status) to science achievement, varying degrees of differences are observed in the Nordic countries. Hansen and Gustafsson (2019) highlighted sizeable variations in the Nordic countries due to ESCS, particularly in Sweden, contrary to the general conception of similar school systems across the Nordic countries.
The present study
We aim to understand the importance of a supportive climate for student achievement using the self-perception of students regarding four aspects of a supportive climate—teacher support, fairness, feedback, and class discipline—based on the Programme for International Student Assessment (PISA) data from 2015. We study the effect of each of the four aspects of supportive climate and identify important factors accounting for the variations in science achievement. Keeping in mind the alterable nature of the supportive climate, it is also of interest to study the influence of pre-existing factors such as student background and school characteristics on academic outcomes. In particular, we address the following questions:
RQ1: To what extent are students’ perceptions of supportive climate and students’ science achievement related, when controlling for student backgrounds at the student level and the school characteristics at the school level, in Denmark, Finland, Iceland, Norway, and Sweden?
RQ2: To what extent can students’ backgrounds and school characteristics explain the variations in students’ science achievement in the Nordic countries?
The PISA international survey has been administered to 15-year-olds by the Organisation for Economic Co-operation and Development (OECD) every 3 years since 2000. One domain out of the three major areas—reading, mathematics, and science literacies—receives additional focus in each cycle. In 2015, the PISA assessments focused on science literacy (as in 2006), and data were collected from 540,000 students in 72 countries using an online test. Further, background data were obtained through student and school questionnaires. As described in the scientific literacy assessment framework, the topics covered the following areas: health and disease, environmental quality, hazards, and frontiers of science and technology, addressed under three competence areas comprising explaining phenomena scientifically, interpreting data and evidence scientifically, and evaluating and designing scientific inquiry (OECD, 2016a). The scientific literacy achievement scores are reported as 10 plausible values (hereafter PV) for each student (we refer to OECD (2017b) for more details about the plausible-value technique). Constructed scales based on the responses describe students’ performance, where positive scale scores represent more positive responses across OECD countries (OECD, 2016a, 2016b). The OECD and the national institutions administering PISA 2015 adhered to human subject research guidelines and approval (OECD, 2017b).
In the 2015 PISA, a stratified two-stage cluster sample was used, in which schools were sampled at the first stage with systematic probability proportional to size (PPS), and around 30 students (15-year-olds) across grades were sampled from each school at the second stage. We analysed data for five countries in the Nordic region: Denmark, Finland, Iceland, Norway, and Sweden. In Iceland, all 15-year-olds were included in the sample, whereas two-stage sampling was used in the other four countries.
According to the OECD (2016a, 2016b) reports, four-fifths of these students in Denmark, Finland and Sweden were in grade 9, whereas almost all students in Iceland and Norway were in grade 10. The number of students and schools for each of the five countries is presented in Table 1.
Given the two-stage random sampling design of PISA, students’ self-reports do not necessarily refer to the same science teacher (OECD, 2017b). These reports are broadly associated with students’ conduct in the classroom (Atlay et al., 2019) and project the collective perceptions of students within a school. In our understanding, this limits the scope for ascribing classroom management to a particular teacher, and the reports do not measure what a classroom-level construct would (Klieme, 2013; Marsh et al., 2012).
We organize the study variables into four major constructs: science achievement; supportive climate, based on students’ perceptions, as individual-level indicators; student background; and school characteristics as school-level measures (Table 2). Scale indices, standardized with a mean of zero and a standard deviation of one, were generated by the OECD using item response modelling. Unless specified otherwise, a positive value indicates a higher incidence of the measured phenomenon. Gender was recoded (girl = 0, boy = 1) and immigrant background was dummy coded (native student = 1, other = 0), where IMMIG1 represents a first-generation immigrant student (i.e. a student born outside the country and whose parents were also born in another country), and IMMIG2 represents a second-generation immigrant student (a student who was born in the country but whose parent/s were born in another country).
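The recoding can be sketched as follows. The assumed raw codes (1 = female, 2 = male for gender; 1 = native, 2 = second-generation, 3 = first-generation for the immigrant-background index) are illustrative assumptions about the PISA codebook, and the sketch is not the study's actual preparation script.

```python
def recode_gender(raw):
    """Recode raw gender (assumed: 1 = female, 2 = male)
    into the study's dummy: girl = 0, boy = 1."""
    return {1: 0, 2: 1}[raw]

def recode_immig(raw):
    """Split the assumed immigrant-background index (1 = native,
    2 = second-generation, 3 = first-generation) into the dummies
    described above."""
    native = 1 if raw == 1 else 0   # native student = 1, other = 0
    immig1 = 1 if raw == 3 else 0   # first-generation immigrant student
    immig2 = 1 if raw == 2 else 0   # second-generation immigrant student
    return native, immig1, immig2
```

Each student thus contributes one gender dummy and three mutually exclusive immigrant-background indicators to the student-level model.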
Considering the PISA data’s hierarchical structure, we employed a statistical approach called multilevel linear modelling (MLM). In addition to providing correct standard errors, MLM allowed us to deal with the clustering in our data in order to investigate relationships at more than one level (Hox et al., 2010; Snijders & Bosker, 2011).
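As a generic illustration (not the exact specification estimated in this study), a two-level random-intercept model for student $i$ in school $j$ can be written as:

```latex
\text{Level 1 (students):}\quad Y_{ij} = \beta_{0j} + \beta_{1j} X_{ij} + r_{ij}, \qquad r_{ij} \sim N(0,\, \sigma^{2})
\text{Level 2 (schools):}\quad \beta_{0j} = \gamma_{00} + \gamma_{01} W_{j} + u_{0j}, \qquad u_{0j} \sim N(0,\, \tau_{00})
```

where \(Y_{ij}\) is a science achievement plausible value, \(X_{ij}\) a student-level predictor (e.g. perceived teacher support), and \(W_{j}\) a school-level predictor; the random term \(u_{0j}\) captures the school-level clustering that MLM accounts for when producing correct standard errors.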
The data were prepared using SPSS Version 26, and all analyses were conducted in the statistical package Mplus Version 8.3 (Muthén & Muthén, 1998). Given that PISA 2015 applied a two-stage random sampling procedure, differences in the probabilities of being selected as a study participant occurred (Asparouhov, 2005). To adjust for these differences in all analyses, we used the students’ final weights along with school weights (Mplus options WEIGHT = WGT; BWEIGHT = WT_SCH), with both weights scaled to account for the complex design. Additionally, we used group-mean centring for student-level variables, in order to examine the effects of student-level and school-level variables independently (Enders & Tofighi, 2007), and grand-mean centring for school-level measures. Maximum likelihood with robustness to non-normality (MLR) was used to estimate all models, and full information maximum likelihood (FIML) was used to handle missing data. The analyses were conducted on all ten plausible values, and the resulting model parameters for each value were pooled following Rubin’s combination rules through the TYPE = IMPUTATION option in Mplus. Following Hu and Bentler’s (1999, p. 27) recommendations, we evaluated model fit using a combination of cut-offs: comparative fit index (CFI) and Tucker-Lewis index (TLI) above 0.95, standardized root mean square residual (SRMR) below 0.09, and root mean square error of approximation (RMSEA) below 0.06.
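Under Rubin's combination rules, the pooled point estimate is the average of the per-plausible-value estimates, and the total sampling variance adds the average within-analysis variance to an inflated between-analysis variance. A minimal sketch of this pooling follows; Mplus performs it internally via TYPE = IMPUTATION, so the function is illustrative rather than part of the study's scripts.

```python
from statistics import mean

def pool_rubin(estimates, variances):
    """Pool one parameter across M analyses (one per plausible value).

    estimates: the M point estimates of the parameter
    variances: the M squared standard errors
    Returns (pooled estimate, total sampling variance).
    """
    m = len(estimates)
    qbar = mean(estimates)                                  # pooled point estimate
    ubar = mean(variances)                                  # within-analysis variance
    b = sum((q - qbar) ** 2 for q in estimates) / (m - 1)   # between-analysis variance
    return qbar, ubar + (1 + 1 / m) * b                     # Rubin's total variance
```

With PISA's ten plausible values, M = 10, and the pooled standard error is the square root of the returned total variance.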
As a starting point, an unconditional model (a model without explanatory variables) was examined for each country; intraclass correlation coefficient (ICC: the proportion of the between-school variance in the overall variance) estimates above 0.05 were taken as supporting evidence for multilevel modelling (Geiser, 2012). Subsequently, a set of intermediate single-level and two-level models were examined (results in Tables 4, 5, 6).
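The ICC used for this threshold is the share of the total outcome variance that lies between schools, computed from the unconditional model's variance components:

```latex
\mathrm{ICC} = \frac{\tau_{00}}{\tau_{00} + \sigma^{2}}
```

where \(\tau_{00}\) is the between-school variance and \(\sigma^{2}\) the within-school (student-level) residual variance.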
We started our analysis by examining the student-level variables in two steps. First, we tested the relationship between the four supportive climate variables and the outcome variable (science achievement) collected through student ratings (Model 1 in Table 4). In the next model, we controlled for the background variables: gender, immigrant status, socioeconomic background, and educational aspirations (Model 2 in Table 4). We then tested the relationships at the school level in Model 3, where our interest lay only in the effects of the school means of the supportive climate variables on the outcome variable. The control variables (school characteristics) were then added in Model 4. Table 5 presents the results for these models with only school-level predictors.
Finally, we performed a multilevel analysis and examined the variables on both levels in two separate models (Table 6). In step one, an initial model with only the predictors of interest was developed. We included school-level means of the supportive climate indicators, in addition to the level-1 predictors, in Model 5, to understand the differences between within-school and between-school regressions. These school means represent the average perceptions of all 15-year-old students in the same school and are taken as proxies for the school situation. This allowed us to analyse the additional effect of student composition on achievement over and above individual perceptions of supportive climate.
Subsequently, in Model 6, control variables at both levels (student background and school characteristics) were added to the supportive climate indicators. A set of fully standardized beta coefficients (in units of standard deviations), estimated for each country, enabled us to compare regression coefficients across countries. We note that, due to missing data, there are fewer students in all countries than the number of students in the sample (Table 1). To compare the relative fit of two competing models, the reduction in deviance (the log-likelihood multiplied by minus 2) from the initial model to the full model was recorded for each country; the “just identified” regression model had a perfect model fit, with CFI and TLI values of 1.00.
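The deviance comparison mentioned here is, in standard form:

```latex
D = -2 \ln L, \qquad \Delta D = D_{\text{initial}} - D_{\text{full}}
```

where a positive \(\Delta D\) indicates better relative fit of the full model; for nested models, \(\Delta D\) can be referred to a \(\chi^{2}\) distribution with degrees of freedom equal to the difference in the number of estimated parameters (with MLR estimation, a scaled difference test is needed for formal testing).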
We examined the descriptive statistics (means and standard deviations) of the studied variables related to supportive climate indicators, student background, and school climate characteristics. Given that the variables were scaled towards the international scores, with an OECD mean of zero and an OECD standard deviation of one, no ceiling or floor effects were visible in the data in any of the five countries. The scale reliabilities for the indices presented in Table 2 are reported in the 2015 PISA report (OECD, 2017b). Cronbach’s alpha was lower only for the school-level measure of ‘negative student behaviour hindering learning’; almost all other scales showed acceptable values above 0.80, which can be considered high, in all five countries. Reviewing the correlations and using Cohen’s (1992) guidelines (low around 0.1, medium around 0.3, and large above 0.5), we found that the supportive climate measures used in this study correlated significantly with each other in all countries, in a low to medium range with magnitudes between 0.18 and 0.40. Results from the unconditional means model showed ICC estimates of 0.19 for Denmark, 0.08 for Finland, 0.04 for Iceland, 0.09 for Norway, and 0.16 for Sweden. These ICC values indicate that mean science achievement scores vary notably across schools in four countries, and to a small extent in Iceland, providing an important argument for conducting a two-level analysis (Geiser, 2012; Hox et al., 2010).
Results from the student-level analysis
We started by testing the predictors of interest in Model 1 and then added the control variables in Model 2 (Table 4). The addition of control variables improved the log-likelihood (i.e., reduced the deviance) from −142,828.72 (SD = 79.46) for Model 1 to −138,462.38 (SD = 77.78) for Model 2.
Generally, no substantial changes were noticed in the supportive climate variables after the inclusion of control variables. Teacher support remained significant in Denmark, Finland, and Norway, but with low effect sizes. Coefficients for teachers perceived as fair were positive in all five countries, with effect sizes indicating a weak relationship with achievement. The results also show that perceived feedback from teachers remained significant after controlling for student background, but the relationship was very weak. A trivial relationship was also observed for class discipline, and only in Finland. Regarding the students’ background indicators, gender was significant in four out of five countries, but not in Finland. Furthermore, the results show that native students have an advantage over students with an immigrant background in all five countries. However, second-generation students have a slight advantage over first-generation students in four countries, the exception being Denmark (where first-generation students performed as well as second-generation students; OECD, 2016b). Students’ future educational aspirations and their ESCS, however, contributed to their achievement.
Results from the school-level analysis
In a separate analysis, the variables at the school level were tested in Model 3 and, with control variables, in Model 4 (Table 5). The results from Model 3, with school means of the supportive climate variables, revealed both significant associations (with low to medium effect sizes) and non-significant associations between school factors and science achievement. The addition of the control variables improved the log likelihood from −161,528.57 (SD = 90.94) for Model 3 to −139,740.25 (SD = 81.72) for Model 4. In Model 3, significant coefficients were observed for mean perceived feedback in Denmark, Iceland, and Sweden, and for mean class discipline in all countries except Denmark.
With the school means of the four student-level variables, along with control variables at the school level in Model 4, the aggregated teacher support score displayed a significant regression coefficient only in Norway (β = 0.35, SE = 0.15, p < 0.05). After the addition of the control variables, the coefficient for mean perceptions of fair teachers in Model 4 was marginally significant in Norway (β = 0.21, SE = 0.11, p = 0.051) and Sweden (β = 0.20, SE = 0.11, p = 0.059). On average, student responses in all countries except Finland showed a significant relationship between mean perceived feedback and science achievement. Coefficients for mean class discipline were significant, with low to medium effect sizes, in Iceland, Norway, and Sweden. Compared with Model 3, the coefficients for class discipline were marginally significant in Finland (β = 0.28, SE = 0.14, p = 0.051) and Norway (β = 0.27, SE = 0.14, p = 0.058).
Student behaviour hindering learning at the school level displayed negative associations in all countries but was significant only in Denmark and Norway. The student–teacher ratio was significantly linked to science achievement only in Sweden, whereas no association between school type and science achievement was found. The shortage of educational resources hindering learning yielded weak links only in Iceland. The explained variance at the between level increased from Model 3 to Model 4 in all countries except Finland.
Results from the two-level analysis
In a multilevel model, we can analyze both the inter-school and intra-school relationships between the independent and dependent variables. The log likelihood improved from −142,647.16 (SD = 79.14) for Model 5 to −119,684.47 (SD = 66.57) for Model 6.
Our results showed that individual background variables partially account for the student-level variance, as the “within-level” variables were associated with student science achievement in the Nordic countries. Although the standardized regression coefficients for the background indicators are significant, they are weak to modest.
Further, the estimated R2 values for the student level in Model 6 (Table 6) ranged from 17.7 to 24.9% across the countries studied. At the between-school level, the estimated R2 values ranged from 34.0 to 69.1%. The latter nevertheless explains a smaller proportion of the overall variation in achievement because of the low ICC values: the student-level variables contributed relatively more than those at the school level, and most of the variance in achievement was therefore due to students’ individual differences. Considering the four supportive climate variables at the student level, significant beta coefficients were observed in Model 6 for teacher support, teachers being fair, and perceived feedback from teachers, of almost the same magnitude as in Model 1 (Table 4). Class discipline at the student level was significant only in Finland. The results for the student background variables in Model 6 illustrate that science achievement is significantly associated with students’ individual characteristics, such as gender (except in Finland), immigrant status, socioeconomic status, and future educational aspirations.
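Level-specific R2 values of this kind are commonly computed as the proportional reduction in residual variance at each level relative to the null model. A minimal sketch with hypothetical variance components (the numbers are invented for illustration, not taken from Table 6):

```python
def level_r2(null_var: float, model_var: float) -> float:
    """Proportional reduction in residual variance at one level (a common R2 analogue)."""
    return (null_var - model_var) / null_var

# Hypothetical within- and between-school residual variances (null model vs. Model 6)
print(round(level_r2(6480.0, 5330.0), 3))  # student level
print(round(level_r2(1520.0, 1000.0), 3))  # school level
```

Note that a large school-level R2 still explains little of the total variance when the ICC is small, which is exactly the pattern described above.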
At the school level in Model 6, significant beta coefficients for mean teacher support were noted in Denmark and Norway, compared with nonsignificant coefficients in Model 4 (Table 5). For mean perceived feedback, significant coefficients were noted in Denmark, Iceland, and Sweden. Class discipline was significant for Finland at the school level, compared with being nonsignificant in Model 4 (Table 5); for Norway, class discipline turned nonsignificant at the school level in Model 6, compared with being significant in Model 4. As for the school background variables in Model 6, negative student behaviour was significant in Denmark and marginally significant in Norway (p = 0.060), whereas the student–teacher ratio remained significant only in Sweden. For the shortage of educational resources, the coefficient remained significant only for Iceland. As almost all students in Iceland attend public schools, we also tested Model 6 without the school type variable, which did not change the results. Overall, our findings indicated cross-country differences, not only in the sizes of the relations between supportive climate variables and science achievement but also in the conclusions following from them.
In this section, we discuss how our findings can be interpreted in relation to the research questions posed in this study.
Supportive climates and students’ science achievement
Our findings revealed similar associations across countries, along with distinct country-specific characteristics, in the contribution of the four aspects of supportive climates to science achievement at both levels.
In our analysis, significant but minor relationships between teacher support and science achievement at the student level were visible in Denmark, Finland, and Norway. Notably, these trivial relationships make it difficult to draw meaningful conclusions about their overall contribution. Teacher support is understood as both the instrumental and the emotional support provided by teachers, as captured by the five items of this scale (OECD, 2016a, 2017a). Further, its visibility as a predictor at the school level in Denmark and Norway highlights its underlying importance: Danish and Norwegian students perceive that they receive the required support in science lessons when necessary. The weak associations in our results do not, however, undermine the significance of teacher support as an aspect of supportive climates, as students who perceive greater support from teachers largely score higher in reading, after accounting for the socio-economic profile of students and schools (OECD, 2019, vol. III).
Our study emphasizes the relationship between students’ perceptions of fair teachers and their science achievement. In all five countries, a significant association was found between perceptions of fair teachers and science achievement while controlling for students’ backgrounds at level 1 (Tables 4 and 6). Regarding the supportive climate at the school level, this finding underlines the positive contribution of strong interpersonal relationships between students and teachers in schools. As students facing unfair treatment are more likely to report feeling isolated and to experience negative well-being (Burns et al., 2020; OECD, 2017a), our results highlight teacher fairness as a vital sign of a supportive climate.
Concerning perceived feedback, significant relationships with science achievement at level 1 were noted (Tables 4 and 6), emphasizing the value of the feedback teachers give. Moreover, our results support the findings by Sortkær (2018) regarding the contribution of feedback to activating students’ understanding of subject content, and thus its academic benefits (Hattie & Timperley, 2007). However, the negative association of feedback with performance possibly indicates that, in the Nordic countries, teachers provide individual feedback to students showing lower competence (OECD, 2016a, 2017a).
In contrast to Atlay et al.’s (2019) results, no links between class discipline and science achievement at level 1 were observed in our study for any country except Finland (Table 6). Even though this construct reflects disruptions and undesirable student conduct in the classroom, due to PISA’s sampling design, in which students were sampled across the grade level, students’ perceptions of the disciplinary climate do not always describe the direct actions of a particular science teacher at the class level (Aditomo & Köhler, 2020).
School-level effects of supportive climate variables
Following recommendations to model collective opinions, since students within the same classroom or school share perceptions (Kunter et al., 2007; Scherer et al., 2016), we observed positive and significant effects of the school average, used as a proxy for the school situation, for individual measures in certain countries. As averages of group perceptions, these contextual variables capture the difference between the within-school and between-school regression coefficients; the school-level effect is added to the student-level effect.
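Such a contextual variable is the student-level perception aggregated to its school mean; the contextual effect is then the between-school coefficient over and above the within-school one. A minimal sketch of the aggregation step, with hypothetical data (school IDs and scores are invented for illustration):

```python
from collections import defaultdict

def school_means(records):
    """Aggregate a student-level perception into school means (contextual proxies)."""
    sums = defaultdict(lambda: [0.0, 0])
    for school_id, value in records:
        sums[school_id][0] += value
        sums[school_id][1] += 1
    return {sid: total / n for sid, (total, n) in sums.items()}

# Hypothetical student reports of teacher support, keyed by school
students = [("A", 0.2), ("A", 0.6), ("B", -0.4), ("B", 0.0), ("B", 0.1)]
means = school_means(students)
print(round(means["A"], 2), round(means["B"], 2))  # one mean per school
```

With grand-mean centring, the difference between the coefficient of this school mean and the student-level coefficient yields the contextual effect (Enders & Tofighi, 2007).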
Danish and Norwegian students were consistent in their perceptions of teacher support, as strong connections with achievement were observed at both levels. Even though students in these two countries perceived receiving less support from their teachers than students in the other countries, that support was more strongly related to achievement at the school level. In addition, school means of perceptions of teachers being fair were not significantly related to achievement in any of the five countries, despite being significantly related to achievement in all countries at the student level. One explanation could be that comparing countries’ average assessments may be misleading, as the assessment of behaviour differs from person to person and across cultures (Kjærnsli & Lie, 2011).
As an aggregated factor at the school level, perceived feedback also had strong associations with science achievement in Denmark, Iceland, and Sweden (Table 6). The mean and standard deviation (Table 3) for the students in Iceland show the lowest value for perceived feedback among the Nordic countries. However, in the multilevel analysis in Table 6, the school-level beta coefficients for perceived feedback (school average) were higher for Iceland. While the questions in the feedback construct focus on the individual level (e.g. “The teacher tells me how I can improve my performance”), a possible explanation could be the relatively low science achievement in Iceland compared with the other countries and the high proportion of immigrant students in Denmark and Sweden.
The disciplinary climate factor showed better predictive power at the school level than at the student level in three countries in our study. In our analysis at the school level, mean class discipline was modestly associated with science achievement in Finland, Iceland, and Sweden. For Finland, the association between the disciplinary climate in science classrooms and science achievement at the student level was also significant, as shown in earlier research (Grabau et al., 2021). According to Broeck et al. (2020), the compositional effect due to shared group perceptions disguises school effects to a certain extent. We suspect that more culturally homogeneous schools (e.g. with few students of foreign background) might contribute to this association. This reasoning, however, does not hold in the case of Sweden, where the link between mean class discipline and science achievement was much stronger (Table 6), even though the schools are heterogeneous. This aspect motivates us to examine the relationship in further detail.
Specifically, the fairness- and feedback-related questions ask about more individual experiences (e.g., “Teachers called on me less often than they called on other students” or “The teacher gives me feedback on my strengths in this subject”), while the teacher support and discipline-related questions ask about more collective experiences (e.g., “The teacher shows an interest in every student’s learning” or “There is noise and disorder”). Regarding the disciplinary climate variable in particular, due to the structure of the data (students nested in schools), the compositional effects visible at the school level could stem from an intermediate level (e.g. the class level) (Scherer et al., 2016).
Considering the teacher’s role, the two measures teacher support and perceived feedback from teachers overlap in their conceptual understanding of engaging and enabling students in teacher actions to facilitate student learning (Dietrich et al., 2015). Earlier findings that Nordic science teachers are more attentive to low-performing students (Sortkær, 2018) further support this reasoning. Substantiating the fact that focusing on interpersonal dynamics contributes to improving the learning environment (Bryk et al., 2010; Burns et al., 2020), we emphasize that our use of four aspects from the PISA school climate construct aligns with the dimension of supportive climate. The analysis provides insight into the empirical understanding of the role of supportive climates, understood through student reports on four aspects.
Further, our results highlight teacher–student relations, in terms of teachers who address students’ personal needs by providing feedback and being fair, as pivotal in developing a positive supportive climate. Although we concentrated on the relationship between supportive climates and science achievement, the interpretation of our results may be transferable beyond the science subjects, as none of the four aspects of supportive climate under consideration is strictly subject-specific. Against the backdrop of our research question, it seems pertinent that teachers, by being adaptable and flexible, have a major role to play in creating strong and positive teacher–student relationships while sustaining a positive supportive climate (Coleman, 1988; Hattie, 2009; Krane et al., 2017; Kyriakides et al., 2009). Keeping this in mind, educators and school authorities can implement plans intended to raise instructional quality in their efforts to moderate differences in academic outcomes for all students.
Student background and school characteristics in relation to science achievement
Concerning student background, the regression coefficients (Table 6) were very small in magnitude but still statistically significant. This could be partly due to the large sample sizes; nevertheless, as these variables individually contribute to the explained variance, we find that student background characteristics are relevant for explaining variation in science achievement.
Gender differences, which earlier research found to favour boys in science performance (Sun et al., 2012), were not marked in our analysis despite being statistically significant in four Nordic countries. In recent years, girls have been catching up with, and even outperforming, boys in science, as seen in the results from Finland (OECD, 2016a; Mullis et al., 2020). Gender effects increased, while immigration effects decreased, with the addition of the four supportive climate factors, suggesting that a supportive climate has different effects in the Nordic countries depending on student background. In all five countries, socioeconomic status contributed significantly to the variance in science achievement, underscoring its relevance for both administrators and policymakers, as also noted by Yang Hansen and Gustafsson (2016).
In addition, our results demonstrated a significant association between students’ future educational aspirations and their science achievement. There may be a mutual influence between students’ own ambitions for further education and higher levels of science achievement (Broeck et al., 2020). However, the cross-sectional design of PISA does not make it possible to go further into this relationship and explore the potential reciprocity.
The education systems in the five countries aim to provide equal opportunities to all students while maintaining a certain quality of instruction. Perhaps because of this, school-level variables (Table 6) such as school type and the shortage of educational resources were not found to be significant predictors of science achievement across the Nordic countries. The uneven influence of school-level factors, in terms of large student–teacher ratios in connection with school size, was visible only in Sweden and Denmark. In contrast, school variables in Icelandic schools had less practical significance than in the other four countries, as they do not explain much of the variance.
Regarding the considerable between-school variation in both Denmark and Sweden, it appeared relevant to investigate the associated variables and their contribution to the explained variance in science achievement. Sources of variation in Denmark and Sweden could include the sizeable immigrant population and the increased number of students attending private government-dependent schools (23.2% and 17.8% of Danish and Swedish students, respectively, in 2015; OECD, 2016b). Further, besides the expansion of the private school sector, the Swedish school system differs in terms of school size, coupled with unevenly distributed teacher quality between schools (The Swedish National Agency for Education, 2017). Regarding the large student–teacher ratio and its positive connection to student achievement, research has shown that maintaining school discipline is a difficult task for teachers and principals in large schools (Welsh et al., 2000). Compared with small schools, staff in large schools face much greater disciplinary problems, emphasizing the importance of smaller school size for upholding class discipline in science lessons (OECD, 2016b).
Taken together, student background and school characteristics appeared to influence science achievement to a smaller extent in Finland and Iceland, as shown by the low between-school variation (Table 6). Despite a link between socioeconomic status and science achievement, disparities between Finnish schools were also the lowest among the Nordic countries (Ahonen, 2021; OECD, 2016a, 2016b, 2019). Compared with Finland and Iceland, Sweden has seen a major increase in the proportion of immigrant students (first- and second-generation) in the last decade, followed by Norway (OECD, 2016a, 2016b). With the arrival of a large number of migrants from non-European countries, the language and socioeconomic gaps experienced by immigrant students might have contributed to the observed differences (Bilgili et al., 2018). It should be noted that although the percentage of immigrant students has also increased in Denmark, school composition mattered relatively little to how well the students performed on the PISA test (Greve & Krassel, 2017). In summary, the contrasting findings indicate that, among the Nordic countries with their equity-oriented education systems, by far the greatest school heterogeneity is found in Sweden; to some extent, these between-school variations are also increasing in Denmark and Norway.
From an equity perspective, the differences between the five countries also point to deviations from the concept of a school for all, underscoring the growing influence of student background factors on educational outcomes. Although the Nordic education model is based on a vision of equity, that vision is, to a certain extent, evident mainly in rhetoric (e.g. curricula and allocation of resources; Volckmar, 2019). Given that this study’s purpose is to show the difference in variance explained at the student and school levels across the countries, our selection of variables limits our ability to explain the underlying mechanisms. Nevertheless, the observed variations in student achievement in the Nordic countries point to a need for effective instructional methods to support non-native students with lower ESCS, which might help alleviate the observed inequalities.
Limitations and outlook
First, the cross-sectional design of this study restricted the ability to identify causal effects; research designs involving longitudinal analysis are recommended for investigating possible causality. Second, this study addressed only selected student characteristics and school aspects concerning supportive climates, which is not an exhaustive description for understanding associations with students’ academic environment. We focused only on students’ perceptions, excluding teachers’ perspectives, as no teacher data were available for the Nordic countries in the PISA study. Concerning the questionnaire, another limitation is that the questions related to the fair teacher factor did not refer to science lessons, while the other three factors were closely tied to the science lesson, which may have affected both students’ responses and the results. Finally, we did not test the constructs for measurement invariance across the countries, which imposes restrictions on interpreting the associations in cross-country comparisons.
Despite these limitations, our cross-country comparison of the influence of supportive climates contributes to understanding the range of variables and the variation in associations related to supportive climates. Given that strong associations exist between supportive climates and student motivation (Krane et al., 2017) and between motivation and achievement (Hattie, 2009), future research should incorporate other relevant contextual factors and methodologies that would provide additional information on supportive climates in relation to both cognitive and non-cognitive outcomes.
This study examined the impact of a supportive climate using four aspects: teacher support, fairness, feedback, and class discipline. Using multilevel modelling with 2015 PISA data for five Nordic countries, we found that student perceptions of teacher fairness and feedback from teachers matter most in association with students’ science achievement while controlling for relevant student, family, and school variables. Aggregated perceptions of class discipline at the school level also contribute to better student achievement in science. Our results accentuate the importance of understanding supportive climates in a broader sense and the pertinence of stronger teacher–student relationships in enhancing educational outcomes.
Availability of data and materials
The study represents a secondary data analysis of the public use PISA 2015 file provided by the OECD. The PISA 2015 Science data for Norway have been made publicly available by the OECD and can be accessed at: https://www.oecd.org/pisa/data/2015database/.
Aditomo, A., & Köhler, C. (2020). Do student ratings provide reliable and valid information about teaching quality at the school level? Evaluating measures of science teaching in PISA 2015. Educational Assessment, Evaluation, and Accountability, 32(3), 275–310.
Ahonen, A. K. (2021). Finland: Success through equity—the trajectories in PISA performance. In N. Crato (Ed.) Improving a Country’s Education (pp. 121–136). Springer, Cham. https://doi.org/10.1007/978-3-030-59031-4_6.
Antikainen, A. (2006). In search of the Nordic model in education. Scandinavian Journal of Educational Research, 50(3), 229–243.
Asparouhov, T. (2005). Sampling weights in latent variable modeling. Structural Equation Modeling, 12(3), 411–434.
Atlay, C., Tieben, N., Hillmert, S., & Fauth, B. (2019). Instructional quality and achievement inequality: How effective is teaching in closing the social achievement gap? Learning and Instruction, 63, 101211.
Baumert, J., Kunter, M., Blum, W., Brunner, M., Voss, T., Jordan, A., Klusmann, U., Krauss, S., Neubrand, M., & Tsai, Y. M. (2010). Teachers’ mathematical knowledge, cognitive activation in the classroom, and student progress. American Educational Research Journal, 47, 133–180. https://doi.org/10.3102/0002831209345157.
Bellens, K., Van Damme, J., Van Den Noortgate, W., Wendt, H., & Nilsen, T. (2019). Instructional quality: Catalyst or pitfall in educational systems’ aim for high achievement and equity? An answer based on multilevel SEM analyses of TIMSS 2015 data in Flanders (Belgium), Germany, and Norway. Large-Scale Assessments in Education, 7(1), 1.
Berkowitz, R., Moore, H., Astor, R. A., & Benbenishty, R. (2017). A research synthesis of the associations between socioeconomic background, inequality, school climate, and academic achievement. Review of Educational Research, 87(2), 425–469.
Bijou, M., & Liouaeddine, M. (2018). Gender and students’ achievements: Evidence from PISA 2015. World Journal of Education, 8(4), 24–35.
Bilgili, O., Volante, L., & Klinger, D. (2018). Immigrant student achievement and the performance disadvantage. In L. Volante, D. Klinger, & O. Bilgili (Eds.), Immigrant student achievement and education policy, policy implications of research in education. Cham: Springer.
Björnsson, J. K. (2020). Teaching culturally diverse student groups in the Nordic countries: What can the TALIS 2018 data tell us? In T. S. Frønes, A. Pettersen, J. Radišić, & N. Buchholtz (Eds.), Equity, equality and diversity in the Nordic model of education (pp. 75–97). Cham: Springer. https://doi.org/10.1007/978-3-030-61648-9_4.
Bronfenbrenner, U. (1992). Ecological systems theory. London: Jessica Kingsley Publishers.
Bryk, A. S., Sebring, P. B., Allensworth, E., Easton, J. Q., & Luppescu, S. (2010). Organizing schools for improvement: Lessons from Chicago. University of Chicago Press.
Burić, I., & Kim, L. E. (2020). Teacher self-efficacy, instructional quality, and student motivational beliefs: An analysis using multilevel structural equation modeling. Learning and Instruction, 66, 101302.
Burns, E. C., Martin, A. J., & Collie, R. J. (2020). Supporting and thwarting interpersonal dynamics and student achievement: A multi-level examination of PISA 2015. International Journal of Research & Method in Education, 43(4), 364–378.
Chen, Y. L., & Guo, S. Y. (2016). Effect of perception of teachers’ supporting behaviour on academic achievement in middle school youths: A mediated moderation effect. Chinese Journal of Clinical Psychology, 24(2), 332–337.
Chiu, M. M., & Klassen, R. M. (2010). Relations of mathematics self-concept and its calibration with mathematics achievement: Cultural differences among fifteen-year-olds in 34 countries. Learning and Instruction, 20(1), 2–17. https://doi.org/10.1016/j.lindif.2007.03.007.
Chow, A., Eccles, J. S., & Salmela-Aro, K. (2012). Task value profiles across subjects and aspirations to physical and IT-related sciences in the United States and Finland. Developmental Psychology, 48(6), 1612.
Cohen, J. (1992). A power primer. Psychological Bulletin, 112(1), 155.
Cohen, J., & Geier, V. (2010). School climate research summary—January 2010 school climate brief. New York: Center for Social and Emotional Education.
Coleman, J. S. (1988). Social capital in the creation of human capital. American Journal of Sociology, 94, S95–S120.
Colquitt, J. A. (2001). On the dimensionality of organizational justice: A construct validation of a measure. Journal of Applied Psychology, 86(3), 386.
Danielson, C. (2007). The many faces of leadership. Educational Leadership, 65(1), 14–19.
Deal, T. E., & Peterson, K. D. (2016). Shaping school culture. John Wiley & Sons.
Dietrich, J., Dicke, A.-L., Kracke, B., & Noack, P. (2015). Teacher support and its influence on students’ intrinsic value and effort: dimensional comparison effects across subjects. Learn Instruct, 39, 45–54. https://doi.org/10.1016/j.learninstruc.2015.05.007.
Enders, C. K., & Tofighi, D. (2007). Centering predictor variables in cross-sectional multilevel models: A new look at an old issue. Psychological Methods, 12(2), 121.
Farkas, G. (2017). Human capital or cultural capital?: Ethnicity and poverty groups in an urban school district. Routledge.
Fauth, B., Decristan, J., Rieser, S., Klieme, E., & Büttner, G. (2014). Student ratings of teaching quality in primary school: Dimensions and prediction of student outcomes. Learning and Instruction, 29, 1–9. https://doi.org/10.1016/j.learninstruc.2013.07.001.
Fauth, B., Decristan, J., Decker, A. T., Büttner, G., Hardy, I., Klieme, E., & Kunter, M. (2019). The effects of teacher competence on student outcomes in elementary science education: The mediating role of teaching quality. Teaching and Teacher Education, 86, 102882.
Fredricks, J. A., Hofkens, T., Wang, M. T., Mortenson, E., & Scott, P. (2018). Supporting girls’ and boys’ engagement in math and science learning: A mixed methods study. Journal of Research in Science Teaching, 55(2), 271–298.
Geiser, C. (2012). Data analysis with Mplus. Guilford Press.
Grabau, L. J., Lavonen, J., & Juuti, K. (2021). Finland, a package deal: Disciplinary climate in science classes, science dispositions and science literacy. Sustainability, 13(24), 13857.
Greve, J., & Krassel, K. F. (2017). PISA Etnisk 2015: Hvordan elever med indvandrerbaggrund klarer sig i PISA-testen og deres holdninger og forventninger til naturvidenskab (p. 10599). Udgiver: KORA Projekt.
Hansen, K. Y., & Gustafsson, J. E. (2019). Identifying the key source of deteriorating educational equity in Sweden between 1998 and 2014. International Journal of Educational Research, 93, 79–90.
Harwell, M., Maeda, Y., Bishop, K., & Xie, A. (2017). The surprisingly modest relationship between SES and educational achievement. The Journal of Experimental Education, 85(2), 197–214.
Hattie, J. A. C. (2009). Visible learning: A synthesis of over 800 meta-analyses relating to performance. Routledge.
Hattie, J. (2013). Calibration and confidence: Where to next? Learning and Instruction, 24, 62–66.
Hattie, J., & Timperley, H. (2007). The power of feedback. Review of Educational Research, 77(1), 81–112.
Howes, C., Guerra, A. W., Fuligni, A., Zucker, E., Lee, L., Obregon, N. B., & Spivak, A. (2011). Classroom dimensions predict early peer interaction when children are diverse in ethnicity, race, and home language. Early Childhood Research Quarterly, 26(4), 399–408.
Hox, J. J. (2010). Multilevel analysis: Techniques and applications (2nd ed.). Routledge.
Hu, L. T., & Bentler, P. M. (1999). Cutoff criteria for fit indexes in covariance structure analysis: Conventional criteria versus new alternatives. Structural Equation Modeling: A Multidisciplinary Journal, 6(1), 1–55.
Jimerson, S. R., & Haddock, A. D. (2015). Understanding the importance of teachers in facilitating student success: Contemporary science, practice, and policy. School Psychology Quarterly, 30(4), 488–493. https://doi.org/10.1037/spq0000134.
Kjærnsli, M., & Lie, S. (2011). Students’ preference for science careers: International comparisons based on PISA 2006. International Journal of Science Education, 33(1), 121–144.
Klem, A. M., & Connell, J. P. (2004). Relationships matter: Linking teacher support to student engagement and achievement. Journal of School Health, 74(7), 262–273.
Klette, K. (2015). Introduction: Studying interaction and instructional patterns in classrooms, teaching and learning in lower secondary schools in the era of PISA and TIMSS. Springer Publishing Company. https://doi.org/10.1007/978-3-319-17302-3_1.
Klieme, E. (2013). The role of large-scale assessments in research on educational effectiveness and school development. The role of international large-scale assessments: perspectives from technology, economy, and educational research (pp. 115–147). Cham: Springer.
Klieme, E., & Kuger, S. (2014). PISA 2015 draft questionnaire framework. Paris: OECD. http://www.oecd.org/pisa/pisaproducts/PISA-2015-draft-questionnaire-framework.pdf.
Klieme, E., Schümer, G., & Knoll, S. (2001). Mathematikunterricht in der Sekundarstufe I: „Aufgabenkultur“ und Unterrichtsgestaltung. In BMBF (Ed.), TIMSS—Impulse für Schule und Unterricht: Forschungsbefunde, Reforminitiativen, Praxisberichte und Video-Dokumente (pp. 43–58). Bonn: BMBF.
Klieme, E., Pauli, C., & Reusser, K. (2009). The Pythagoras study: Investigating effects of teaching and learning in Swiss and German mathematics classrooms. The power of video studies in investigating teaching and learning in the classroom (pp. 137–160). Waxmann: Munster.
Klusmann, U., Kunter, M., Trautwein, U., Lüdtke, O., & Baumert, J. (2008). Engagement and emotional exhaustion in teachers: Does the school context make a difference? Applied Psychology, 57, 127–151.
Krane, V., Ness, O., Holter-Sorensen, N., Karlsson, B., & Binder, P. E. (2017). ‘You notice that there is something positive about going to school’: How teachers’ kindness can promote positive teacher–student relationships in upper secondary school. International Journal of Adolescence and Youth, 22(4), 377–389.
Kratz, H. E. (1896). Characteristics of the best teacher as recognized by children. The Pedagogical Seminary, 3(3), 413–460.
Kunter, M., Baumert, J., & Köller, O. (2007). Effective classroom management and the development of subject-related interest. Learning and Instruction, 17(5), 494–509.
Kunter, M., Klusmann, U., Baumert, J., Richter, D., Voss, T., & Hachfeld, A. (2013). Professional competence of teachers: Effects on instructional quality and student development. Journal of Educational Psychology, 105(3), 805–820. https://doi.org/10.1037/a0032583
Kyriakides, L., & Creemers, B. P. (2011). Can schools achieve both quality and equity? Investigating the two dimensions of educational effectiveness. Journal of Education for Students Placed at Risk, 16(4), 237–254.
Kyriakides, L., Creemers, B. P., & Antoniou, P. (2009). Teacher behaviour and student outcomes: Suggestions for research on teacher training and professional development. Teaching and Teacher Education, 25(1), 12–23.
Lanahan, L., McGrath, D. J., McLaughlin, M., Burian-Fitzgerald, M., & Salganik, L. (2005). Fundamental problems in the measurement of instructional processes: Estimating reasonable effect sizes and conceptualizing what is important to measure. Washington, DC: American Institutes for Research.
Lau, K. C., & Ho, S. C. E. (2020). Attitudes towards science, teaching practices, and science performance in PISA 2015: Multilevel analysis of the Chinese and Western top performers. Research in Science Education. https://doi.org/10.1007/s11165-020-09954-6.
Liou, P. Y., Wang, C. L., Lin, J. J., & Areepattamannil, S. (2020). Assessing students’ motivational beliefs about learning science across grade level and gender. The Journal of Experimental Education, 89(4), 605–624.
Lipko-Speed, A., Dunlosky, J., & Rawson, K. A. (2014). Does testing with feedback help grade-school children learn key concepts in science? Journal of Applied Research in Memory and Cognition, 3(3), 171–176.
Lipowsky, F., Rakoczy, K., Pauli, C., Drollinger-Vetter, B., Klieme, E., & Reusser, K. (2009). Quality of geometry instruction and its short-term impact on students’ understanding of the Pythagorean Theorem. Learning & Instruction, 19, 527–537. https://doi.org/10.1016/j.learninstruc.2008.11.001.
Liu, H., Van Damme, J., Gielen, S., & Van Den Noortgate, W. (2015). School processes mediate school compositional effects: Model specification and estimation. British Educational Research Journal, 41(3), 423–447.
Ma, X., & Willms, J. D. (2004). School disciplinary climate: Characteristics and effects on eighth grade achievement. Alberta Journal of Educational Research. https://doi.org/10.11575/ajer.v50i2.55054.
Marsh, H. W., Lüdtke, O., Nagengast, B., Trautwein, U., Morin, A. J. S., & Abduljabbar, A. S. (2012). Classroom climate and contextual effects: Conceptual and methodological issues in the evaluation of group-level effects. Educational Psychologist, 47(2), 106–124. https://doi.org/10.1080/00461520.2012.670488.
Martin, M. O., Mullis, I. V. S., Foy, P., & Hooper, M. (2016). TIMSS 2015 International Results in Science. Boston College, TIMSS & PIRLS International Study Center website: http://timssandpirls.bc.edu/timss2015/international-results/.
Mullis, I. V. S., Martin, M. O., Foy, P., Kelly, D. L., & Fishbein, B. (2020). TIMSS 2019 International Results in Mathematics and Science. Retrieved from Boston College, TIMSS & PIRLS International Study Center. https://timssandpirls.bc.edu/timss2019/international-results/.
Muthén, B., & Muthén, L. (1998–2014). Mplus Version 8.3 [Statistical software package]. Muthén & Muthén.
Nilsen, T., & Gustafsson, J. E. (2014). School emphasis on academic success: Exploring changes in science performance in Norway between 2007 and 2011 employing two-level SEM. Educational Research and Evaluation, 20(4), 308–327.
Nilsen, T., & Gustafsson, J. E. (2016). Teacher quality, instructional quality and student outcomes: Relationships across countries, cohorts and time (p. 166). Cham: Springer.
Ning, B., Van Damme, J., Van Den Noortgate, W., Yang, X., & Gielen, S. (2015). The influence of classroom disciplinary climate of schools on reading achievement: A cross-country comparative study. School Effectiveness and School Improvement, 26(4), 586–611.
OECD. (2013). PISA 2012 results: Excellence through equity—giving every student the chance to succeed (vol. II). PISA: OECD Publishing. https://doi.org/10.1787/9789264201132-en.
OECD. (2016a). PISA 2015 results (vol. I): Excellence and equity in education. PISA: OECD Publishing. https://doi.org/10.1787/9789264266490-en.
OECD. (2016b). PISA 2015 results (vol. II): Policies and practices for successful schools. PISA: OECD Publishing. https://doi.org/10.1787/9789264267510-en.
OECD. (2017a). PISA 2015 results (vol.III). Students’ well-being. PISA: OECD Publishing. https://doi.org/10.1787/9789264273856-en.
OECD (2017b). PISA 2015 technical report. OECD Publishing. https://www.oecd.org/pisa/data/2015-technical-report/.
OECD. (2019). PISA 2018 results (vol. III): What school life means for students’ lives. PISA: OECD Publishing. https://doi.org/10.1787/acd78851-en.
Pianta, R. C., & Allen, J. P. (2008). Building capacity for positive youth development in secondary school classrooms: Changing teachers’ interactions with students. In M. Shinn & H. Yoshikawa (Eds.), Toward positive youth development: Transforming schools and community programs (pp. 21–39). Oxford University Press.
Pianta, R. C., & Hamre, B. K. (2009). Conceptualization, measurement, and improvement of classroom processes: Standardized observation can leverage capacity. Educational Researcher, 38(2), 109–119.
Pianta, R., La Paro, K., & Hamre, B. K. (2007). Classroom assessment scoring system. Baltimore: Paul H. Brookes.
Pitzer, J., & Skinner, E. (2017). Predictors of changes in students’ motivational resilience over the school year: The roles of teacher support, self-appraisals, and emotional reactivity. International Journal of Behavioral Development, 41(1), 15–29.
Praetorius, A. K., Pauli, C., Reusser, K., Rakoczy, K., & Klieme, E. (2014). One lesson is all you need? Stability of instructional quality across lessons. Learning and Instruction, 31, 2–12.
Praetorius, A.-K., Klieme, E., Herbert, B., & Pinger, P. (2018). Generic dimensions of teaching quality: The German framework of three basic dimensions. ZDM, 50(3), 407–426.
Rudasill, K. M., Snyder, K. E., Levinson, H., & Adelson, J. L. (2018). Systems view of school climate: A theoretical framework for research. Educational Psychology Review, 30(1), 35–60.
Scherer, R., & Nilsen, T. (2016). The relations among school climate, instructional quality, and achievement motivation in mathematics. In T. Nilsen & J. E. Gustafsson (Eds.), Teacher quality, instructional quality and student outcomes (pp. 51–80). Amsterdam: International Association for the Evaluation of Educational Achievement (IEA), SpringerOpen.
Scherer, R., Nilsen, T., & Jansen, M. (2016). Evaluating individual students’ perceptions of instructional quality: An investigation of their factor structure, measurement invariance, and relations to educational outcomes. Frontiers in Psychology, 7, 110.
Seidel, T., & Shavelson, R. (2007). Teaching effectiveness research in the past decade: The role of theory and research design in disentangling meta-analysis results. Review of Educational Research, 77, 454–499. https://doi.org/10.3102/0034654307310317.
Sirin, S. R. (2005). Socioeconomic status and academic achievement: A meta-analytic review of research. Review of Educational Research, 75(3), 417–453.
Snijders, T. A., & Bosker, R. J. (2011). Multilevel analysis: An introduction to basic and advanced multilevel modelling. Thousand Oaks: Sage.
Sortkær, B. (2018). Feedback for everybody? Variations in students’ perception of feedback. Northern Lights on TIMSS and PISA, 2018, 161.
Sortkær, B., & Reimer, D. (2016). Disciplinary climate and student achievement: Evidence from schools and classrooms. Danish School of Education, Aarhus University.
Sortkær, B., & Reimer, D. (2018). Classroom disciplinary climate of schools and gender—evidence from the Nordic countries. School Effectiveness and School Improvement, 29(4), 511–528.
Sun, L., Bradley, K. D., & Akers, K. (2012). A multilevel modelling approach to investigating factors impacting science achievement for secondary school students: PISA Hong Kong sample. International Journal of Science Education, 34(14), 2107–2125. https://doi.org/10.1080/09500693.2012.708063.
Taut, S., & Rakoczy, K. (2016). Observing instructional quality in the context of school evaluation. Learning and Instruction, 46, 45–60.
Tosto, M. G., Asbury, K., Mazzocco, M. M., Petrill, S. A., & Kovas, Y. (2016). From classroom environment to mathematics achievement: The mediating role of self-perceived ability and subject interest. Learning and Individual Differences, 50, 260–269.
Uline, C., & Tschannen-Moran, M. (2008). The walls speak: The interplay of quality facilities, school climate, and student achievement. Journal of Educational Administration, 46(1), 55–73.
Van den Broeck, L., Demanet, J., & Van Houtte, M. (2020). The forgotten role of teachers in students’ educational aspirations. School composition effects and the buffering capacity of teachers’ expectations culture. Teaching and Teacher Education, 90, 103015.
Van Tartwijk, J., & Hammerness, K. (2011). The neglected role of classroom management in teacher education. Teaching Education, 22, 109–112. https://doi.org/10.1080/10476210.2011.567836.
Vieluf, S. (2012). Teaching practices and pedagogical innovations: Evidence from TALIS. Paris: OECD Publishing. https://doi.org/10.1787/9789264123540.
Volckmar, N. (2019). The enduring quest for equity in education: Comparing Norway and Australia. Scandinavian Journal of Educational Research, 63(4), 617–631.
Wagner, W., Göllner, R., Werth, S., Voss, T., Schmitz, B., & Trautwein, U. (2016). Student and teacher ratings of instructional quality: Consistency of ratings over time, agreement, and predictive power. Journal of Educational Psychology, 108(5), 705.
Wang, M. T., Degol, J. L., Amemiya, J., Parr, A., & Guo, J. (2020). Classroom climate and children’s academic and psychological wellbeing: A systematic review and meta-analysis. Developmental Review, 57, 100912.
Way, N., Reddy, R., & Rhodes, J. (2007). Students’ perceptions of school climate during the middle school years: Associations with trajectories of psychological and behavioural adjustment. American Journal of Community Psychology, 40, 194–213.
Weaver, M. R. (2006). Do students value feedback? Student perceptions of tutors’ written responses. Assessment & Evaluation in Higher Education, 31(3), 379–394.
Welsh, W. N., Stokes, R., & Greene, J. R. (2000). A macro-level model of school disorder. Journal of Research in Crime and Delinquency, 37(3), 243–283.
Wentzel, K. R., Muenks, K., McNeish, D., & Russell, S. (2018). Emotional support, social goals, and classroom behavior: A multilevel, multisite study. Journal of Educational Psychology, 110(5), 611.
Wong, T. K., Tao, X., & Konishi, C. (2018). Teacher support in learning: Instrumental and appraisal support in relation to math achievement. Issues in Educational Research, 28(1), 202–219.
Yang Hansen, K., & Gustafsson, J. E. (2016). Causes of educational segregation in Sweden—school choice or residential segregation. Educational Research and Evaluation, 22(1–2), 23–44.
Yetişir, M. İ., & Batı, K. (2021). The effect of school and student-related factors on PISA 2015 science performances in Turkey. International Journal of Psychology and Educational Studies, 8(2), 170–186.
Yıldırım, S. (2012). Teacher support, motivation, learning strategy use, and achievement: A multilevel mediation model. The Journal of Experimental Education, 80(2), 150–172.
The authors would like to thank the Norwegian PISA 2015 group for providing and preparing the data for the present study, and the OECD for making the data available.
We confirm that this manuscript is original and has not been published elsewhere, nor is it under concurrent consideration elsewhere. All three authors have approved the manuscript and agree with its submission to the special issue in Large-scale Assessments in Education. Furthermore, the authors accept the copyright information and author’s rights. There are no conflicting interests.
No additional funding was associated with this research.
No financial interests or benefits have arisen from direct application of this research.
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Cite this article
Rohatgi, A., Hatlevik, O.E. & Björnsson, J.K. Supportive climates and science achievement in the Nordic countries: lessons learned from the 2015 PISA study. Large-scale Assess Educ 10, 12 (2022). https://doi.org/10.1186/s40536-022-00123-x