

Context factors and student achievement in the IEA studies: evidence from TIMSS

Abstract

Background

The present study investigates which factors related to the school context influence student achievement on the TIMSS mathematics tests across countries. A systematic review of the literature on PIRLS, TIMSS, and ICCS was first conducted to identify the school, teacher, and classroom factors shown to be useful predictors of student performance in previous IEA studies. Data from representative samples of grade 8 students in 28 countries that participated in TIMSS 2011 were analysed. The main aim of the present study is to verify which school and teacher characteristics are positively associated with students’ mathematics achievement, focusing mainly on disadvantaged schools. Furthermore, it aims to identify how school context variables contribute to explaining the performance of students in disadvantaged schools in comparison with more advantaged schools.

Methods

A separate analysis was carried out for each country considered, and the same multilevel regression model was applied both to the sampled schools as a whole and to schools with high (highest tertile) and low (lowest tertile) socio-economic backgrounds treated as distinct groups.

Results

The results confirmed that a high socio-economic status has a significant and positive effect on student achievement: compared with students from socio-economically disadvantaged schools, students from advantaged schools performed better in mathematics. This difference is more evident in countries where the gap between rich and poor, as measured by the Gini coefficient (an index of how far an economy deviates from perfect equality), is wider, and smaller in countries where that gap is narrower.

Conclusions

Consistent with the literature in the field, the results show significant differences across countries in the school and teacher characteristics that have an impact on the mathematics achievement of students from low and high SES schools. Different patterns were also found within countries for low and high SES schools.

Background

A review of the literature on how school-level factors were, and still are, used to predict and interpret student performance in some of the IEA studies (PIRLS, TIMSS, and ICCS) showed that it is quite difficult to find strong associations between school-level factors and students’ results at the country level on the basis of data of the type collected in comparative studies, and that wide differences are found across participating countries. Such results suggest the need for more in-depth analyses. The multilevel model for the analysis of TIMSS 2011 results presented below, intended to explore whether the impact of those factors on student achievement varies in relation to schools’ socio-economic status, originally stemmed from that review.

School-level variables in the IEA studies

From the beginning, IEA international comparative studies have included background questionnaires to collect data on contextual factors to be linked to students’ cognitive outcomes (Postlethwaite 1967; Peaker 1975; Walker 1976). Participating students, their teachers and the principals of their schools respond to background questionnaires. These questionnaires are central to the analysis of results in terms of a range of student and school characteristics—from students’ economic, social and cultural capital to context of instruction, including schools’ human and material resources, as well as school and classroom conditions and processes (IEA 1998). The school-level variables used for the development of the questionnaire address school context (e.g., urban/rural, community resources), school characteristics (e.g., school type, school size, and instructional time), school resources (e.g., teaching materials and equipment), school initiatives in the field of specific interest to each survey, school management (e.g., funding, decision-making processes, staffing practices and teacher evaluation, curricular emphasis, and parental involvement), teachers’ characteristics and activities, teaching practices, and classroom activities (Postlethwaite and Ross 1992).

Data collected through background questionnaires are central to both international reports and secondary analyses, as they make it possible to better contextualise students’ results on the cognitive tests and help identify the school and classroom factors that have a direct or indirect impact on student performance.

International reports show that the school variables associated with student achievement are mainly those related to school context, school characteristics and resources, and class characteristics (Walker 1976; Postlethwaite and Ross 1992; Schulz et al. 2010; Mullis et al. 2012a, b). CIVED and ICCS 2009 found that an open classroom climate for discussion was a positive predictor of student civic knowledge in several countries, although data about that particular school characteristic were collected through the student questionnaire rather than the school questionnaire (Schulz et al. 2010; Torney-Purta et al. 2001).

However, it is exceedingly difficult to draw causal inferences, such as concluding that a particular process-related school variable directly affects student achievement, and wide differences across and within countries are usually reported.

Several studies indicated that teacher-related factors have effects of various magnitudes on student achievement in mathematics (e.g., Rivkin et al. 2005; Akiba et al. 2007; Akyüz and Berberoglu 2010; Tsai and Yang 2015; Winnaar et al. 2015). For example, De Witte and Van Klaveren (2014), using data from Dutch students participating in TIMSS 2003, found that high test scores are associated with teaching styles that emphasize problem solving and homework.

Furthermore, several studies used the TIMSS data for secondary analyses at the individual country level.

Lee and Huh (2014) investigated the impact of teachers’ instructional strategies on student learning in mathematics and which instructional strategies are positively associated with student learning outcomes in the United States, using eighth grade students’ mathematics data from TIMSS 2007. Teachers’ instructional strategies were found to explain approximately 17% of the variance in learning outcomes at the teacher level.

Nilsen and Gustafsson (2014) showed that school emphasis on academic success contributed to explaining the increased science performance in Norway between TIMSS 2007 and 2011.

However, other studies found limited evidence of this impact on student achievement in both mathematics (Luschei and Chudgar 2011; Sturman and Lin 2011; Dodeen and Hilal 2012) and reading literacy (e.g., Van Daal et al. 2006; Schagen and Twist 2008). Recently, Gao (2014) investigated whether inquiry-based instruction is more effective in influencing student science achievement than traditional teaching methods. Using the TIMSS 2011 8th grade dataset from Singapore, Chinese Taipei and the US, the author did not find a clear and positive relation between the type of instruction and student achievement.

Moreover, research has still not been able to clearly identify a pool of teacher characteristics and classroom practices that consistently improve student learning across countries (Goe 2007). In a recent review of TIMSS research literature, Drent et al. (2013) outlined the existence of wide differences across countries. Phan (2008) found that different variables are associated with mathematics achievement in different countries. In her studies, she tested three different models: (1) the instructional practices model, (2) the teacher background model and (3) a full model, including all of the variables considered in the previous models. The author found that the first model was the best for the United States; the second model served as the most efficient for predicting math achievement in Egypt, and the third model performed the best in Canada and South Africa.

Martin et al. (2000) conducted an analysis on school effectiveness using TIMSS 1995 data in countries with a large between-school variance. The authors identified several characteristics that distinguished low- from high-achieving schools and then examined school factors using hierarchical linear modelling. After controlling for student socio-economic status, the factors considered did not show a strong relationship with mathematics achievement across all countries. Only a few school characteristics, such as school climate, instructional activities and teacher characteristics were found to be associated with school achievement in some countries.

Recently, a multilevel model related to TIMSS and PIRLS 2011 was developed by Martin et al. (2013) to verify what characteristics of effective schools and of specific home backgrounds are associated with higher student achievement in reading, mathematics, and science at grade 4. The results showed that the ‘Home Resources for Learning’ variable was the strongest predictor of student achievement, with significant effects on both between and within school variance in almost every country. After controlling for the ‘Home Background’ variable, the strength of the relationship between school environment and instruction and student achievement was considerably reduced across countries. The variable ‘Schools Are Safe and Orderly’ maintained a significant effect in at least one subject after controlling for ‘Home Resources’ in 15 countries, while ‘Schools Support Academic Success’ had a positive impact on student achievement in at least one subject in 10 countries. In general, this study found considerable differences across countries in the ways school variables are related to student achievement, with similar results for reading, mathematics and science.

Analyses conducted on ICCS 2009 data have shown a limited impact of school factors on student civic knowledge, except for the open classroom climate (Fraillon et al. 2011). Attempts have also been made to explore the relationships between school factors and students’ non-cognitive outcomes in ICCS 2009. Controlling for student and school SES, the impact of school variables was shown to be negligible (Caponera and Losito 2011; Caponera et al. 2012).

These results appear to confirm that SES is the most important contextual factor affecting student learning outcomes (Coleman et al. 1966; Coleman 1975; OECD 2005).

The literature review carried out for this study showed a relevant impact of SES at the school level: the analysis of the influence of student socio-economic status on student achievement seemed to confirm that variables such as parents’ educational level and the amount of resources available at home are strongly associated with student achievement (Chiu and Xihua 2008; Ismail and Awang 2008). In some studies, an index of the overall socio-economic status of each individual school was calculated as the average of students’ socio-economic background, and different studies showed the relevance of this variable in explaining student achievement (e.g., McConney and Perry 2010). Furthermore, in several studies, only a few variables were found to have a significant effect on mathematics achievement once the socio-economic level of schools and students was taken into account (Wiberg et al. 2013; Wiberg and Rolfsman 2013). Recently, some studies have tried to better understand whether, and the extent to which, school variables contribute to improving student achievement in disadvantaged schools (Baird 2008; Shepherd 2013).

Sandoval-Hernández et al. (2014) used data from TIMSS 2011 (fourth grade) to conduct an HLM analysis of school factors for two sub-samples: disadvantaged students and non-disadvantaged students. They compared the results across ten European countries and found that the school factors considered in the analysis had a stronger relation with non-disadvantaged students’ achievement in most countries.

Based on the review of the existing literature on the IEA studies’ results, and with particular attention to secondary analyses of the TIMSS survey, the present study investigates whether, and the extent to which, those results may help clarify which factors related to the school context influence student achievement on the TIMSS mathematics tests across countries. On the basis of the evidence discussed so far, the aim of the present study is to verify which school and teacher characteristics are positively associated with students’ mathematics achievement, focusing mainly on disadvantaged schools. Furthermore, it aims to identify how school context variables contribute to explaining the performance of students in disadvantaged schools in comparison with more advantaged schools.

A separate analysis was carried out for each considered country, and the same multi-level regression model was used on the sampled schools as a whole and on schools with high (highest tertile) and low (lowest tertile) socio-economic backgrounds as distinct groups. To verify whether the impact of specific school and teacher characteristics may be different in relation to the schools’ socio-economic characteristics (average student SES at school level), a hierarchical multilevel analysis of TIMSS 2011 data was conducted. The multilevel approach adopted allows for data analysis with a hierarchical structure—where the individual units (students) are “nested” within the aggregated school level. Thus, this technique makes it possible to investigate simultaneously variables measured at student level (the first level of the multilevel analysis) and the impact of some relevant school characteristics on student achievement (second level of the multilevel analysis).
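In a simplified form, the two-level random-intercept structure described above can be sketched as follows (a sketch assuming fixed student-level slopes; the models actually estimated may include additional terms):

```latex
% Level 1: student i in school j, with group-mean centred predictors X_p
Y_{ij} = \beta_{0j} + \sum_{p} \beta_{p}\,\bigl(X_{pij} - \bar{X}_{p\cdot j}\bigr) + r_{ij},
\qquad r_{ij} \sim N(0,\sigma^{2})

% Level 2: school j, with grand-mean centred school variables W_q
\beta_{0j} = \gamma_{00} + \sum_{q} \gamma_{0q}\,\bigl(W_{qj} - \bar{W}_{q}\bigr) + u_{0j},
\qquad u_{0j} \sim N(0,\tau_{00})
```

Here σ² is the within-school (student-level) variance and τ00 the between-school variance, the two components that the null model (model 0, described in the Results section) decomposes.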

Methods

Participants

The analyses presented in this paper were conducted on the TIMSS 2011 data for Grade 8 students. TIMSS used a two-stage sampling design (for a detailed description, see Martin and Mullis 2012). The overall sample consisted of 149,788 students from all participating countries, from 5177 schools.

Countries with poor reliability on the achievement scale (see Note 1) were excluded from our analyses, as were cases with missing values in one or more explanatory variables.

Measures

For the sake of brevity, only the measures that are directly relevant to the study’s aims and hypotheses will be described (for a detailed description see Martin and Mullis 2012).

Mathematics achievement scale. The scale was developed by the research group of the TIMSS project (for a detailed description of the scale, see Martin and Mullis 2012) and consisted of multiple-choice and constructed-response items. The eighth grade mathematics content domains included number, algebra, geometry, and data and chance. The cognitive domains measured were knowing, applying and reasoning. The whole item pool consisted of 217 questions. In TIMSS 2011, various combinations of the assessment items were compiled into 14 booklets while maintaining the distribution of items across content and cognitive domains. Using IRT estimates, a mathematics achievement score was calculated for each student. The scale used to measure mathematics achievement has high internal consistency (Cronbach’s alpha international median 0.87). To take measurement error into account, a set of five plausible values of the mathematics score was provided for each student (for a detailed description, see Martin and Mullis 2012). In the present study, the proficiency score for overall mathematics achievement drawn from the five plausible values obtained through the IRT methodology was used in the analyses.
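As a sketch of the standard combining rule for plausible-value analyses (consistent with the multiple-imputation logic of the TIMSS methodology cited above, not a formula reproduced from it), any statistic t is computed once on each of the M = 5 plausible values and then pooled:

```latex
\bar{t} = \frac{1}{M}\sum_{m=1}^{M} t_m,
\qquad
V(\bar{t}) =
\underbrace{\frac{1}{M}\sum_{m=1}^{M} U_m}_{\text{average sampling variance}}
+ \left(1 + \frac{1}{M}\right)
\underbrace{\frac{1}{M-1}\sum_{m=1}^{M}\bigl(t_m - \bar{t}\bigr)^{2}}_{\text{imputation variance}}
```

where U_m is the sampling variance of the estimate obtained from the m-th plausible value.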

The following variables, derived from the student, teacher and school questionnaires, were used in the analyses; the literature identifies them as good indicators of student performance in mathematics. All of the scales were constructed using IRT scaling methods, specifically the Rasch partial credit model. Using IRT partial credit scaling, student responses were placed on a scale constructed so that the mean scale score across all countries was 10, and the standard deviation was 2 (for a detailed description, see Martin and Mullis 2012).
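For reference, the Rasch partial credit model referred to above gives the probability that a respondent with latent trait θ obtains score x on an item with step parameters δ_1, …, δ_m; this is a sketch of the standard formulation, not the exact TIMSS calibration:

```latex
P(X = x \mid \theta) =
\frac{\exp\!\left(\sum_{k=0}^{x} (\theta - \delta_k)\right)}
     {\sum_{h=0}^{m} \exp\!\left(\sum_{k=0}^{h} (\theta - \delta_k)\right)},
\qquad \text{with } \sum_{k=0}^{0} (\theta - \delta_k) \equiv 0
```

The resulting trait estimates are then linearly rescaled, with a transformation of the form 10 + 2(θ − θ̄)/s over the pooled international sample, so that the reported scale has a mean of 10 and a standard deviation of 2.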

At the student level, we used the following variables derived from the Student Questionnaire and included in the TIMSS database.

Socio-economic status (SES; indicated in the international report as Home educational resources). Based on the answers in the Student Questionnaire (for a detailed description of these scales, see Martin and Mullis 2012), a general index of each student’s socio-economic status was created. The Student Questionnaire collected information about (1) students’ home environments, including the parents’ educational level, (2) the number of resources for study available at home, and (3) how many books there are in the home.

Mathematics self-concept (St_SCM). Students were asked to answer nine questions related to their perceived ability to study and learn mathematics, such as “I learn things quickly in mathematics”.

Students like learning mathematics (St_SLM). The scale consisted of five questions concerning the student’s interest and positive attitude towards mathematics, such as “I enjoy learning mathematics”.

Students value mathematics (St_SVM). Students were asked to answer six questions concerning the importance of studying mathematics to their lives, e.g., “I need to do well in mathematics to get the job I want”.

Students engaged in mathematics lessons (St_EML). The scale was created based on students’ level of agreement with five statements, such as “My teacher gives me interesting things to do”.

Weekly time spent on math homework (St_HMW).

The TIMSS 2011 teacher questionnaire collected information about the teachers of the students participating in the assessment. For schools where more than one teacher per class or per school was selected, we computed the mean of the different scores. The following variables, derived from the teacher questionnaire, were used at the school level.

Safe and orderly school (Teach_SOS). The scale was created based on teachers’ degree of agreement with five statements regarding a disciplined climate in their schools.

Teacher working conditions (Teach_WCN). The scale was created based on teachers’ responses to five questions, such as “In your current school, how severe is each problem? Teachers do not have adequate workspace (e.g., for preparation, collaboration, or meeting with students)”.

School emphasis on academic success (Teach_EAS).

Confidence in teaching mathematics (Teach_CTM). The scale was created based on teachers’ responses to five questions, such as “In teaching mathematics to this class, how confident do you feel to adapt teaching to engage students’ interest?”

Teacher career satisfaction (Teach_CST). The scale was created based on teachers’ degree of agreement with six statements, such as “I am satisfied with being a teacher at this school”.

Collaborate to improve teaching (Teach_CIT). The scale was created based on teachers’ responses to five questions concerning the types of interactions with other teachers, such as “Collaborate in planning and preparing instructional materials”.

Instruction to engage students in learning (Teach_IES). The scale was created based on teachers’ responses to how often they used each of four instructional practices, such as “Use questioning to elicit reasons and explanations”.

The variables derived from the school questionnaire and used for the school level are the following.

Instruction affected by mathematics resource shortages (Princ_MRS). The scale was created based on principals’ responses concerning the availability of resources both at school and classroom levels, such as “How much is your school’s capacity to provide instruction affected by a shortage or inadequacy of calculators for mathematics instruction?”

School emphasis on academic success, principal reports (Princ_EAS). The scale was created based on principals’ responses (e.g., “How would you characterize each of the following within your school? Teachers’ degree of success in implementing the school’s curriculum”).

School discipline and safety (Princ_DAS). Principals were asked to answer 11 questions regarding different discipline problems among eighth grade students at school, such as “Physical injury to other students”.

All of the variables described above, derived from both teacher and school questionnaires, are included in the TIMSS database.

In addition, the following scales were constructed ad hoc.

SES_school. The index was calculated at the school level and corresponds to the students’ SES average in each school. This index was used at the school level to select students attending schools with low socio-economic status and students attending schools with high socio-economic status.
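A minimal sketch of how such a school-level SES index and the tertile split could be computed; the file and column names are illustrative assumptions, not the TIMSS variable names:

```python
import pandas as pd

# One row per student; "ses" is the student-level SES index, "school_id" the school identifier
students = pd.read_csv("timss_grade8_students.csv")

# SES_school: the mean student SES within each school
ses_school = students.groupby("school_id")["ses"].mean().rename("ses_school")

# Split schools into tertiles; the lowest and highest tertiles define the
# "low SES" and "high SES" school groups analysed separately
ses_group = pd.qcut(ses_school, q=3, labels=["low", "mid", "high"]).rename("ses_group")

# Attach both school-level columns back to the student file
students = students.merge(
    pd.concat([ses_school, ses_group], axis=1).reset_index(), on="school_id"
)
```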

Sum of topics taught (Teach_STT). The scale was constructed to measure the extent to which teachers had covered 19 different mathematics topics during the year of the test.

Teacher preparedness (Teach_PRP). A second-order factor analysis was conducted on four indices measuring teacher beliefs regarding their preparation to teach number, algebra, geometry and data. The new index explains 69 % of the variance and has good internal consistency (0.76 median Cronbach alpha across countries).

Parental involvement (Princ_PIN). A factor analysis was conducted on 10 items derived from the school questionnaire, such as “Inform parents about the behaviour and well-being of their child at school”. The index explains 69 % of the variance and has good internal consistency (0.76 median Cronbach alpha across countries).
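Both ad hoc indices report their internal consistency as a median Cronbach’s alpha across countries; the sketch below shows that computation under assumed column names (the item matrix and grouping are illustrative only):

```python
import pandas as pd

def cronbach_alpha(items: pd.DataFrame) -> float:
    """Cronbach's alpha for a respondents-by-items matrix of item scores."""
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1).sum()
    total_variance = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_variances / total_variance)

# Median alpha across countries for a hypothetical set of parental-involvement items
df = pd.read_csv("school_questionnaire.csv")             # assumed file, one row per school
item_cols = [f"pin_item{i}" for i in range(1, 11)]       # assumed item column names
alpha_by_country = df.groupby("country")[item_cols].apply(cronbach_alpha)
print(alpha_by_country.median())
```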

Data analysis

The country-specific descriptive analyses were conducted using the IEA IDB Analyzer (see Note 2), software developed by the IEA Data Processing and Research Center for analysing data from all IEA surveys, by means of adapted macros provided by IEA TIMSS (Foy et al. 2013).

A two-level hierarchical linear model was estimated by means of the software HLM 6.0 (Raudenbush et al. 2004a, b) to investigate the relationship between the school variables and student mathematics achievement, accounting for the socio-economic index. In HLM, plausible values are analysed as multiple imputations.
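The analyses were run in HLM 6.0; purely as an illustration of the same logic in open-source tools (not the authors’ setup), one could fit a random-intercept model to each plausible value and pool the estimates following the combining rule sketched earlier. Variable and file names are assumptions, and sampling weights are omitted for brevity:

```python
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("timss_grade8.csv")              # assumed single-country analysis file
pv_cols = [f"math_pv{i}" for i in range(1, 6)]    # assumed names of the five plausible values

per_pv_estimates = []
for pv in pv_cols:
    # Random intercept for schools; fixed student- and school-level effects
    fit = smf.mixedlm(f"{pv} ~ ses + st_scm + ses_school",
                      data=df, groups=df["school_id"]).fit()
    per_pv_estimates.append(fit.params)

# Point estimates pooled as the mean across the five plausible values
pooled = pd.concat(per_pv_estimates, axis=1).mean(axis=1)
print(pooled)
```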

At level 1—student level—we used the scales described earlier and derived from the student questionnaire; at level 2—school level—we used variables from both the teacher and school questionnaires described above.

In the multilevel analyses, the house weight (HOUSWGT) was used: the total student weight was normalized so that the sum of the weights equalled the student sample size in the data. A proficiency score for overall mathematics achievement drawn from all five plausible values was used as the dependent variable. The independent student-level variables were entered as group-mean centred level 1 variables; the independent school-level averages were entered as grand-mean centred level 2 variables.
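A short sketch of the weighting and centring steps described in this paragraph; HOUSWGT is the TIMSS house weight variable, while the file and remaining column names are assumptions:

```python
import pandas as pd

# Assumed single-country analysis file, one row per student
df = pd.read_csv("timss_grade8.csv")

# House weight normalized so that the weights sum to the realized student sample size
df["norm_wgt"] = df["HOUSWGT"] * len(df) / df["HOUSWGT"].sum()

# Level-1 predictor: group-mean centred (deviation from the school mean)
df["ses_c"] = df["ses"] - df.groupby("school_id")["ses"].transform("mean")

# Level-2 predictor: school mean SES, grand-mean centred (deviation from the mean over schools)
school_ses = df.groupby("school_id")["ses"].mean()
df["ses_school_c"] = df["school_id"].map(school_ses - school_ses.mean())
```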

Results and discussion

Descriptive statistics

Table 1 describes the participants (see Note 3) in TIMSS 2011 by class and by school.

Table 1 Descriptive statistics: number of participants

Depending on the average class size in the country, one class from each sampled school may be sufficient to achieve the desired student sample size. Some countries choose to sample more than one class per school, either to increase the size of the student sample or to provide a better estimate of school-level effects (for a more detailed description see Martin and Mullis 2012). As shown in Table 1, in most countries, only one class per school was selected.

Table 2 shows the descriptive statistics for mathematics across countries. The results are presented by country in alphabetic order.

Table 2 Descriptive statistics: mean and s.e. for mathematics achievement

Concerning mathematics achievement, the difference between students from schools with high and low socio-economic backgrounds varies across countries, from 26 score points in Slovenia to 138 score points in Malaysia.

Slovenia, Sweden, Norway and Finland showed less variation, whereas Malaysia, Australia, Chile, England, Hong Kong SAR, Israel, Romania and Turkey showed a marked difference of more than one standard deviation.

Multilevel analyses

The TIMSS 2011 data were best described by two levels: the student level (level 1) and the school and teacher level (level 2). Level 1 was represented by student background and home resource variables that were unique to each student, and by student characteristics such as self-efficacy and interest in mathematics. Level 2 was represented by instructional practices, teacher background and school background variables, because each school had one sampled mathematics class. The analysis took place in three steps:

  1. First, the variance between and within schools in relation to mathematics achievement was estimated (model 0, with no explanatory variables). This model provides estimates of the variance at each level (within and between schools) and is the reference point for determining how much variance is explained by subsequent models (the corresponding formulas are sketched after this list).

  2. Second, the model was extended by introducing student-level variables (model 1, where the student-level effects were treated as fixed, assuming no variation across schools).

  3. The following steps consisted of introducing school-level and class-level variables into the model (model 2). The model was then completed by adding the school’s average index of socio-economic background (model 3).
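Under the usual proportional-reduction-in-variance logic, the explained-variance figures reported in Table 3 correspond to expressions of the following form (a sketch of the standard formulas, not necessarily the exact computation used for the table):

```latex
R^{2}_{\text{within}} =
\frac{\sigma^{2}_{\text{model 0}} - \sigma^{2}_{\text{full model}}}{\sigma^{2}_{\text{model 0}}},
\qquad
R^{2}_{\text{between}} =
\frac{\tau_{00,\text{model 0}} - \tau_{00,\text{full model}}}{\tau_{00,\text{model 0}}}
```

where σ² and τ00 are the estimated within- and between-school variance components, respectively.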

Table 3 shows the total variance, the between-school variance and the percentage of variance explained by the full model for all students, for students from low SES schools and for students from high SES schools. Table 4 presents the full model.

Table 3 Total variance and explained variance in mathematics achievement across countries
Table 4 Multilevel regression results: estimated effects on TIMSS mathematics achievement—school level

As expected, after removing the influence of SES at school level, the between-school variance appeared reduced, except for Slovenia (as far as low SES schools were concerned) and Chile, Georgia, Japan, Republic of Korea, Thailand, Tunisia, Turkey, United Arab Emirates and the United States (as far as high SES schools were concerned).

School emphasis on academic success (principal reports) is positively related to student achievement in 7 out of 28 countries (Georgia, Hungary, Australia, Finland, Lebanon, Lithuania, Republic of Korea).

Parental involvement is negatively related to student achievement in low SES schools in 4 out of 28 countries: Georgia, Hungary, New Zealand, and Malaysia.

Teacher preparedness was found to be positively associated with achievement in low SES schools in 5 countries (Turkey, Hong Kong, Hungary, New Zealand, Malaysia). Instruction to engage students in learning was found to be positively associated with achievement in low SES schools in 4 countries (Hungary, Republic of Korea, New Zealand, Tunisia) and negatively associated with it in 2 countries (Italy and Lithuania). Confidence in teaching mathematics was found to be positively associated with achievement in disadvantaged schools in 4 countries (England, Lebanon, Lithuania, Norway) and negatively in one country (Hungary).

The impact of the considered variables differed across countries. For example, in New Zealand, Thailand and Ukraine, the full model for high SES schools explained a higher proportion of between-school variance than for low SES schools, and parental involvement, school emphasis on academic success (principal reports) and safe and orderly school (teacher reports) had a significant impact on mathematics achievement, which was not the case in low SES schools.

Conclusion

The main aim of the present study was to evaluate the impact of context factors on mathematics achievement, focusing on students attending socio-economically disadvantaged schools across the countries participating in TIMSS 2011. Context factors reflecting the availability or non-availability of economic and cultural resources within the family context play a relevant role in determining student performance. As expected, and in line with previous studies (see, e.g., Sirin 2005; Chiu and Xihua 2008; Ismail and Awang 2008; Levpušček et al. 2013), our analyses showed that a high socio-economic status has a significant and positive effect on student achievement: compared with students from socio-economically disadvantaged schools, students from advantaged schools performed better in mathematics. This difference is more evident in countries where the gap between rich and poor, as measured by the Gini coefficient (an index of how far an economy deviates from perfect equality), is wider (see Note 4). Those countries are Chile, England, Turkey, Malaysia and Israel. The difference is smaller in the countries with the narrowest gap between rich and poor, namely the northern European countries, such as Finland, Norway and Sweden, and Slovenia. Japan and Australia are two exceptions. Although Japan has a large gap between rich and poor, its difference in mathematics achievement is about half of a standard deviation. In Australia, the gap between rich and poor is not particularly wide, but the difference in mathematics achievement is around one standard deviation.
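For reference, the Gini coefficient referred to above (see Note 4) has the standard definition below for incomes x_1, …, x_n with mean x̄; this is the textbook formula, not one taken from the source cited in Note 4:

```latex
G = \frac{\sum_{i=1}^{n}\sum_{j=1}^{n} |x_i - x_j|}{2\,n^{2}\,\bar{x}}
```

so that G equals 0 under perfect equality and approaches 1 when income is concentrated in a single recipient.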

Moreover, consistent with the literature (Martin and Mullis 2013; Drent et al. 2013; Phan 2008), the results show significant differences across countries in the school and teacher characteristics that have an impact on the mathematics achievement of students from low and high SES schools. Different patterns were also found within countries for low and high SES schools.

For other variables considered in this study, such as parental involvement, previous studies have reported discordant results (e.g., McNeal 1999). It is possible that the negative association found here reflects schools’ attempts to establish more direct and stronger relationships with the parents of students with learning problems. Furthermore, in advantaged schools, parental involvement and emphasis on academic success have an impact in a small group of countries, although the direction of this association differs across countries.

Some limitations to this study should be noted. First, to gain a deeper understanding of the present findings, it is necessary to consider the large differences in teaching mathematics across countries (Mullis et al. 2012). A second limitation is that the data used in this study are related to only one school year. As a consequence, generalizations about the influence of context factors on student mathematics achievement should be taken with some degree of caution. Analyses on more than one dataset are needed for a clearer picture of what school factors are associated with mathematics achievement.

Additionally, a more general methodological issue should be investigated. The type of constructs and variables used in questionnaire development, the way these constructs and variables are operationally defined, the self-reported nature of the collected data, and the type of data analyses carried out may all contribute to explaining the difficulty of finding strong associations between students’ performance and process-related school variables. Moreover, the construction of indicators of teaching processes may require a different approach, namely the systematic observation of classroom practices (Postlethwaite and Ross 1992). Furthermore, the results suggest that more advanced and “complex” analytical methods and designs should be tried to address some of the issues outlined here, in an effort to disentangle the effect of SES from that of other variables.

Despite these limitations, the present study investigated the relationship between context factors and student achievement across countries using a large and representative sample of students assessed with a well-established international standardized test. It should be noted that such standardized tests have increasingly been used in recent years by educational and political decision makers to improve teaching and learning in mathematics and the quality of education systems. The results of the present study suggest the value of using school-level factors not only for cross-country comparisons but also for an in-depth investigation of the differences existing within individual countries. Moreover, as already shown by other studies (Sandoval-Hernández et al. 2014), the results of international comparative studies can be used to study the impact of school factors in different school contexts. Different school factors seem to play a different role in different school contexts within each individual country.

Notes

  1. In TIMSS 2011 International Results in Mathematics, students were considered to have achievement too low for estimation if their performance on the assessment was no better than what they could have achieved by simply guessing on the multiple-choice items. Such students were nevertheless assigned scale scores (plausible values) by the achievement scaling procedure, despite concerns about their reliability. We excluded countries flagged with reservations about the reliability of average achievement because the percentage of students with achievement too low for estimation exceeded 15% (Mullis et al. 2012a, p. 456).

  2. The IDB Analyzer handles complex sample designs, uses the plausible value methodology and calculates correct standard errors when conducting analyses with large-scale surveys. It creates SPSS code that can be used to conduct statistical analyses while taking into account the complex sample and assessment structures of these databases. The software allows data from different countries to be combined for cross-country analysis and specific subsets of variables to be selected. In addition, it provides several procedures for analysis, such as the computation of means or percentages of any background variable of interest for a whole country or a subgroup within a population (IEA 2012).

  3. In this study, we only included the countries that had an adequate reliability concerning average achievement in mathematics and where all of the items used in the analyses were administered.

  4. See https://www.cia.gov/library/publications/the-world-factbook/rankorder/2172rank.html.

References

  • Akiba, M., LeTendre, G. K., & Scribner, J. P. (2007). Teacher quality, opportunity gap, and national achievement in 46 countries. Educational Researcher, 36(7), 369–387.


  • Akyüz, G., & Berberoglu, G. (2010). Teacher and Classroom characteristics and their relation to mathematics achievement of the student in the TIMSS. New Horizons in Education, 58(1), 77–95.


  • Baird, K. (2008). An international investigation into the relationship between school resources, policy, and math performance among low socioeconomic status students. Paper presented at 3rd IEA International Research Conference 18–20 September, Taipei, Chinese Taipei. http://www.iea.nl/fileadmin/user_upload/IRC/IRC_2008/Papers/IRC2008_Baird.pdf. Accessed 16 Sept 2015.

  • Caponera, E., Losito, B. (2011). The roles of schools and communities in civic and citizenships education. Paper presented at AERA Annual Meeting, 8–18 April, New Orleans. http://www.iccs.acer.edu.au/uploads/File/AERA2011/AERA_ICCS_SchoolsCommunity(NewOrleans2011).pdf. Accessed 16 Sept 2015.

  • Caponera, E., Losito, B., Mirti, P. (2012). Civic participation at school and school-based community participation. Paper presented at AERA Annual Meeting, 13–17 April, Vancouver. http://www.iccs.acer.edu.au/uploads/File/papers/AERASymposiumCivicParticipationPaper2.pdf. Accessed 14 Sept 2015.

  • Chiu, M. M., & Xihua, Z. (2008). Family and motivation effects on mathematics achievement: analyses of students in 41 countries. Learning and Instruction, 18(4), 321–336.


  • Coleman, J. S. (1975). Methods and results in the IEA studies of effects of school on learning. Review of Educational Research, 45(3), 335–386.


  • Coleman, J. S., Campbell, E. Q., Hobson, C. J., McPartland, J., Mood, A. M., Weinfeld, F. D., et al. (1966). Equality of educational opportunity. Washington, DC: US Government Printing Office.


  • De Witte, K., & Van Klaveren, C. (2014). How are teachers teaching? A nonparametric approach. Education Economics, 22(1), 3–23.


  • Dodeen, H., & Hilal, M. (2012). The effects of teachers’ qualifications, practices, and perceptions on student achievement in TIMSS mathematics: A comparison of two countries. International Journal of Testing, 12, 61–77.


  • Drent, M., Meelissen, M. R. M., & Van der Kleij, F. M. (2013). The contribution of TIMSS to the link between school and classroom factors and student achievement. Journal of Curriculum Studies, 45(2), 198–224.


  • Foy, P., Arora, A., & Stanco, G. M. (Eds.). (2013). TIMSS 2011 user guide for the international database. Chestnut Hill: Boston College.


  • Fraillon, J., Schulz, W., Ainley, J., Van de Gaer, E. (2011). Multi-level analysis of factors explaining differences in civic knowledge. Paper prepared for the annual AERA meeting in New Orleans, 8–12 April. https://www.iccs.acer.edu.au/publications-and-papers. Accessed 16 Sept 2015.

  • Gao, S. (2014). Relationship between science teaching practices and students’ achievement in Singapore, Chinese Taipei, and the US: An analysis using TIMSS 2011 data. Frontiers of Education in China, 14(4), 519–551.


  • Goe, L. (2007). The link between teacher quality and student outcomes: A research synthesis. Washington, DC: National Comprehensive Center for Teacher Quality.


  • International Association for the Evaluation of Educational Achievement. (1998). IEA guidebook. Activities institutions and people. Amsterdam: International Association for the Evaluation of Educational Achievement (IEA).


  • International Association for the Evaluation of Educational Achievement. (2012). International database analyzer (version 3.0). Hamburg: IEA Data Processing and Research Center.


  • Ismail, N. A., & Awang, H. (2008). Assessing the effects of students’ characteristics and attitudes on mathematics performance. Problems of Education in the 21st Century, 9, 34–41.


  • Lee, D., & Huh, Y. (2014). What TIMSS tells us about instructional practice in K-12 mathematics education. Contemporary Educational Technology, 5(4), 286–301.


  • Levpušček, M. P., Zupančič, M., & Sočan, G. (2013). Predicting achievement in mathematics in adolescent students: The role of individual and social factors. The Journal of Early Adolescence, 33(4), 523–551.


  • Luschei, T. F., & Chudgar, A. (2011). Teacher, student achievement and national income: a cross national examination of relationships and interactions. Prospects, 41, 507–533.


  • Martin, M. O., Foy, P., Mullis, I. V. S., & O’Dwyer, L. M. (2013). Effective schools in reading, mathematics, and science at the fourth grade. In M. O. Martin & I. V. S. Mullis (Eds.), TIMSS and PIRLS 2011: Relationships among reading, mathematics, and science achievement at the fourth grade—implications for early learning (pp. 109–178). Chestnut Hill: TIMSS & PIRLS International Study Center, Boston College.


  • Martin, M. O., & Mullis, I. V. S. (Eds.). (2012). Methods and procedures in TIMSS and PIRLS 2011. Chestnut Hill: TIMSS & PIRLS International Study Center, Lynch School of Education, Boston College.


  • Martin, M. O., & Mullis, I. V. S. (Eds.). (2013). TIMSS and PIRLS 2011: Relationships among reading, mathematics, and science achievement at the fourth grade—implications for early learning. Chestnut Hill: TIMSS & PIRLS International Study Center, Boston College.


  • Martin, M. O., Mullis, I. V. S., Gregory, K. D., Hoyle, C., & Shen, C. (2000). Effective schools in science and mathematics: IEA’s third international mathematics and science study. Chestnut Hill: TIMSS & PIRLS International Study Center, Lynch School of Education, Boston College.


  • McConney, A., & Perry, L. B. (2010). Socioeconomic status, self-efficacy, and mathematics achievement in Australia: A secondary analysis. Educational Research for Policy and Practice, 9(2), 77–91.


  • McNeal, R. B. (1999). Parental involvement as social capital: Differential effectiveness on science achievement, truancy, and dropping out. Social Forces, 78(1), 117–144.


  • Mullis, I. V. S., Martin, M. O., Foy, P., & Arora, A. (2012a). TIMSS 2011 international results in mathematics. Chestnut Hill: TIMSS & PIRLS International Study Center, Boston College.


  • Mullis, I. V. S., Martin, M. O., Foy, P., & Drucker, K. T. (2012b). PIRLS 2011 international results in reading. Chestnut Hill: TIMSS & PIRLS International Study Center, Boston College.


  • Mullis, I. V. S., Martin, M. O., Minnich, C. A., Stanco, G. M., Arora, A., Centurino, V. A. S., et al. (2012c). TIMSS 2011 encyclopedia: Education policy and curriculum in mathematics and science, volumes 1 and 2. Chestnut Hill: TIMSS & PIRLS International Study Center, Boston College.


  • Nilsen, T., & Gustafsson, J. E. (2014). School emphasis on academic success: exploring changes in science performance in Norway between 2007 and 2011 employing two-level SEM. Educational Research & Evaluation, 20(4), 308–327.


  • OECD. (2005). School factors related to quality and equity. Results from PISA 2000. Paris: OECD.


  • Peaker, G. F. (1975). An empirical study of education in twenty-one countries: A technical report. Stockholm: Almqvist & Wiksell International.


  • Phan, H. T. (2008). Correlates of mathematics achievement in developed and developing countries: An HLM analysis of TIMSS 2003 eighth-grade mathematics scores. http://www.scholarcommons.usf.edu/etd/452. Accessed 14 Sept 2015.

  • Postlethwaite, N. (1967). School organization and student achievement: A study based on achievement in mathematics in twelve countries. Stockholm: Almqvist & Wiksell International.


  • Postlethwaite, N., & Ross, K. N. (1992). Effective schools in reading. Implication for educational planners. Amsterdam: International Association for the Evaluation of Educational Achievement (IEA).


  • Raudenbush, S. W., Bryk, A. S., Cheong, Y. F., & Congdon, R. T., Jr. (2004a). HLM6 hierarchical linear and nonlinear modeling. Lincolnwood: Scientific Software International Inc.


  • Raudenbush, S. W., Bryk, A. S., & Congdon, R. (2004b). HLM 6 for Windows (Computer software). Lincolnwood: Scientific Software International Inc.


  • Rivkin, S. G., Hanushek, E. A., & Kain, J. F. (2005). Teachers, schools and academic achievement. Econometrica, 73(2), 417–458.


  • Sandoval-Hernández, A., Castejón, A., & Aghakasiri, P. (2014). A comparison of school effectiveness factors for socially advantaged and disadvantaged students in ten European countries in TIMSS 2011. Šolsko polje, 25, 61–96.


  • Schulz, W., Ainley, J., Fraillon, J., Kerr, D., & Losito, B. (2010). ICCS 2009 international report: Civic knowledge, attitudes, and engagement among lower-secondary school students in 38 countries. Amsterdam: International Association for the Evaluation of Educational Achievement (IEA).


  • Schagen, I., & Twist, L. (2008). Adding value to PIRLS by combining with national data and using sophisticated modeling techniques. In: 3rd IEA International Research Conference, Taipei, Chinese Taipei.

  • Shepherd, D. (2013). A question of efficiency: decomposing South African reading test scores using PIRLS 2006. Stellenbosch University, Department of Economics Working Papers (20/2013). http://www.econpapers.repec.org/paper/szawpaper/wpapers196.htm. Accessed 15 Sept 2015.

  • Sirin, S. R. (2005). Socioeconomic status and academic achievement: A meta-analytic review of research. Review of Educational Research, 75(3), 417–453.


  • Sturman, L., & Lin, Y. (2011). Exploring the mathematics gap: TIMSS 2007. RicercaAzione, 3(1), 43–58.


  • Torney-Purta, J., Lehmann, R., Oswald, H., & Schulz, W. (2001). Citizenship and education in twenty-eight countries. Civic knowledge and engagement at age fourteen. Amsterdam: International Association for the Evaluation of Educational Achievement (IEA).


  • Tsai, L. T., & Yang, C. C. (2015). Hierarchical effects of school-, classroom-, and student-level factors on the science performance of eighth-grade Taiwanese students. International Journal of Science Education, 37(8), 1166–1181.


  • Van Daal, V. H. P., Begnum, A. C., Solheim, R. G., & Adèr, H. J. (2006). Secondary analysis of PIRLS 2001 Norwegian data. In: The second IEA international research conference: Proceedings of the IRC-2006, vol. 2 (pp. 177–190). Amsterdam: International Association for the Evaluation of Educational Achievement (IEA).

  • Walker, D. A. (1976). The IEA six subject survey: An empirical study of education in twenty-one countries. Amsterdam: International Association for the Evaluation of Educational Achievement (IEA).


  • Wiberg, M., & Rolfsman, E. (2013). School effectiveness in science in Sweden and Norway viewed from a TIMSS perspective. Utbildning och Demokrati, 22(3), 69–84.


  • Wiberg, M., Rolfsman, E., Laukaityte, I. (2013). School effectiveness in mathematics in Sweden and Norway 2003, 2007 and 2011. Paper presented at 5th IEA international research conference 26–28 June, Singapore. http://www.iea.nl/fileadmin/user_upload/IRC/IRC_2013/Papers/IRC-2013_Wiberg_etal.pdf. Accessed 14 Sept 2015.

  • Winnaar, L. D., Frempong, G., & Blignaut, R. (2015). Understanding school effects in South Africa using multilevel analysis: Findings from TIMSS 2011. Electronic Journal of Research in Educational Psychology, 13(1), 151–170. doi:10.14204/ejrep.38.16036.



Authors’ contributions

EC and BL are the authors of this research paper and have directly participated in the planning, execution, and analysis of this study. Both authors read and approved the final manuscript.

Competing interests

Both authors declare that they have no competing interests.

Author information


Correspondence to Bruno Losito.

Rights and permissions

Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made.


About this article


Cite this article

Caponera, E., Losito, B. Context factors and student achievement in the IEA studies: evidence from TIMSS. Large-scale Assess Educ 4, 12 (2016). https://doi.org/10.1186/s40536-016-0030-6


Keywords