
Trends in educational inequalities in Ireland’s primary schools: an analysis based on TIMSS data (2011–2019)

Abstract

Background

Socioeconomic characteristics are persistently and systematically related to academic outcomes, despite long-standing efforts to reduce educational inequality. Ireland has a strong policy focus on alleviating educational disadvantage and has seen significant improvements in mathematics and science performance in recent years. This study investigates patterns of socioeconomic inequalities in 4th grade students’ performance in mathematics and science between 2011 and 2019. Two measures of inequality are examined: (i) inequality of achievement, i.e., the degree of variability in student performance, and (ii) inequality of opportunity, i.e., the extent to which student performance is related to background characteristics.

Methods

Data for 4th-grade students in Ireland from TIMSS 2011, TIMSS 2015 and TIMSS 2019 were used. Mathematics and science achievement were the main outcome measures. The home resources for learning index was used as a proxy for student-level socioeconomic status. School-level socioeconomic status was examined according to schools’ participation in the Delivering Equality of Opportunity in Schools (DEIS) programme, which is the Department of Education’s main policy initiative addressing educational disadvantage. Descriptive and multilevel regression analyses were conducted to explore variability in student performance and to investigate the variance in achievement explained by socioeconomic factors, across cycles and subjects.

Results

Between 2011 and 2015, between-student and between-school differences in mathematics and science performance became smaller, as shown by the decrease in standard deviations and the intraclass correlation coefficients (ICCs). This points to reduced inequality of achievement. Between 2015 and 2019, a small increase in inequality of achievement was observed.

Regarding inequality of opportunity, students’ home resources for learning and school disadvantaged status were statistically significantly related to mathematics and science achievement across all three cycles. Overall variance explained by these two variables increased from 2011 to 2019. This points towards increasing inequality of opportunity over the period examined.

Performance gaps between disadvantaged and non-disadvantaged schools have been reduced over time; however, the relationship between home resources for learning and achievement appears to have strengthened. Findings were consistent for both subjects.

Conclusions

The findings indicate that improvements in overall performance do not necessarily reflect improved equality. Ireland’s improvements in average mathematics and science performance between 2011 and 2015 were accompanied by reduced inequality of achievement. Performance differences between disadvantaged and non-disadvantaged schools have been reduced over time, suggesting that the DEIS policy is meeting its goal of narrowing achievement gaps based on concentrations of educational disadvantage. However, inequality of opportunity linked to student-level socioeconomic factors (i.e., home resources for learning) appears to have increased over time. These findings are valuable in the context of measuring and tracking educational inequalities.

Introduction

International large-scale assessments, such as the Programme for International Student Assessment (PISA) or the Trends in International Mathematics and Science Study (TIMSS), are recognised as having important influences on national policy (Hopfenbeck et al., 2018; Looney et al., 2022; McNamara et al., 2022; Meyer & Benavot, 2013; Ringarp, 2016), with country rankings based on average achievement receiving considerable media attention in many countries. In order to support appropriate policy responses to educational disadvantage, it is necessary to go beyond country-level averages to gain a more nuanced understanding of the achievement levels of different student groups (Dronkers & de Heus, 2012; Rowley et al., 2020). As well as measuring student achievement in different subjects, large-scale assessments collect information about students’ backgrounds and home environments. Individual student background characteristics, particularly those related to socioeconomic status, are persistently and systematically related to academic performance in most countries, despite efforts to reduce educational inequality (Mullis et al., 2020; OECD, 2019; Sirin, 2005). In addition, since the early work of Coleman et al. (1966), considerable attention has been given to socioeconomic compositional effects—that is, how school-average socioeconomic status may influence student outcomes over and above individual student background (Flannery et al., 2023; Sciffer et al., 2020, 2022; Wilkinson, 2002).

In this paper, we draw on Ferreira and Gignoux's (2014) framework for conceptualising and measuring educational inequality in order to examine changes in inequality over time in Ireland—a country with a recognised policy focus on educational disadvantage (European Commission, 2019; Hepworth et al., 2021) as well as a track record of recent improvement in average achievement in mathematics and science (Perkins & Clerkin, 2020; Shiel et al., 2014). Using data from three TIMSS cycles, our analyses consider inequality of educational achievement and inequality of opportunity, by examining factors related to students’ socioeconomic background. The paper is organised as follows: firstly, we examine some key terminology and concepts regarding inequality in education. Secondly, we outline the Irish policy backdrop related to educational disadvantage, followed by a description of trends in achievement at primary level in mathematics and science in Ireland. Then, we describe methods used in the current study and present findings of our analysis. Finally, we consider our findings in the context of policy on educational disadvantage.

Inequality in education

The terms equality and equity in education are sometimes confused and used interchangeably, although some authors have ascribed specific meanings to each term depending on their particular perspectives. Equity is described by the OECD (2018) as a situation where “differences in students’ outcomes are unrelated to their background or to economic and social circumstances over which students have no control” (p. 13). This is closely related to Ferreira and Gignoux’s (2014) description of “inequality of opportunity”, which is defined in terms of the proportion of the variance in test scores explained by pre-determined characteristics. Espinoza (2007) notes that equality usually indicates sameness in treatment, while equity is associated with fairness. What is evident across much of the literature on equality and equity in education is that the two concepts are interrelated and, while the terminology across studies or models may differ, there is often a high degree of overlap in the concepts being discussed. Despite this variability, there is broad consensus that educational inequality is an important issue for educational systems and that, in order to address it, it is necessary to identify and describe patterns of inequality and the factors that influence them (UNESCO, 2018). Comparisons of these concepts across studies can be facilitated by researchers describing in detail how the measures used in their studies relate to these concepts and, if relevant, the specific models drawn upon.

The current study is based on two definitions offered by Ferreira and Gignoux's (2014) framework for conceptualising and measuring inequality in education—inequality of educational achievement and inequality of opportunity. According to Ferreira and Gignoux, inequality of educational achievement is defined as the degree of variability in student performance and may be measured using the variance (or standard deviation) of test scores. Higher variance in test scores implies that there are larger differences between students’ observed performance—in other words, greater inequality of achievement. Lower variance in test scores is the more favourable outcome since it suggests that students are performing similarly to each other, which indicates greater equality in this sense.

The second element in Ferreira and Gignoux’s framework—inequality of opportunity—focuses on the extent to which students’ academic performance is influenced by background, demographic, socioeconomic or other pre-determined characteristics. Characteristics and circumstances typically considered in the context of inequality of opportunity include gender, ethnic origin and family socioeconomic status. Under this definition, a high degree of equality of opportunity would mean that students’ academic achievement is not influenced in a meaningful way by background characteristics that are outside of students’ control.

Ireland’s support for students at risk of educational disadvantage

While Ireland ranks highly in terms of reducing educational inequality among EU and OECD countries, substantial gaps still exist between the highest- and lowest-performing students (Chzhen et al., 2018; Nelis & Gilleece, 2023; Nelis et al., 2021). In Ireland, educational disadvantage is defined in legislation as “the impediments to education arising from social or economic disadvantage which prevent students from deriving appropriate benefit from education in schools” (Government of Ireland, 1998, p. 32). The Irish government’s most recent policy response, introduced in 2005 and updated in 2017, is entitled Delivering Equality of Opportunity in Schools (DEIS; see Note 1). Of more than 3100 primary schools in Ireland in the 2006/2007 school year (the first school year in which all elements of the programme were fully implemented in all schools participating in DEIS), about 340 primary schools were classified as urban DEIS and a similar number as rural DEIS (Weir & Archer, 2011). Some schools have been added to DEIS since its inception, notably a comparatively small number in 2017 (see Note 2) using a modified method of identification and a larger number in 2022, although the 2022 expansion is outside the timeframe of the data used in the current study. With limited exception (e.g., in the event of permanent school closure), schools have not been removed from DEIS. Prior to the 2022 expansion of the programme, approximately 20% of the overall school population attended DEIS schools (Department of Education, 2022; Department of Education and Science, 2005; Department of Education and Skills, 2017).

Under the DEIS programme, schools that are identified as serving the highest concentrations of students at risk of educational disadvantage receive additional supports, with some variation between primary and post-primary levels. At primary level, there are some differences between supports received by DEIS schools in urban and rural areas and, in urban areas only, DEIS schools are assigned to one of two bands. Band 1 comprises schools with the highest concentrations of socioeconomic disadvantage and Band 2 comprises schools experiencing less severe levels of disadvantage than those in Band 1. A key element of support for Band 1 schools is a reduction in class sizes, while an important support for both Band 1 and Band 2 schools is the provision of a Home School Community Liaison coordinator. All DEIS schools receive additional grant aid, priority access to teacher professional development and expanded provision from the National Educational Psychological Service. The full range of supports provided to DEIS schools at both primary and post-primary levels is outlined in Department of Education and Skills (2017).

Measuring educational inequality in Ireland and achievement in mathematics and science at primary level

An evaluation of Ireland’s DEIS programme found that the average reading and mathematics scores of students in 2nd, 3rd and 6th grades in DEIS primary schools were well below the corresponding national averages in 2007, 2010 and 2013. Mean scores in DEIS schools increased between 2007 and 2013 while, at the same time, the percentages of students with very low scores decreased significantly and there were slight increases in the percentages of high-scoring students (Weir, 2016; Weir & Denner, 2013). Small but statistically significant increases in reading and mathematics were also observed among subsamples of students in longitudinal cohorts (i.e., 2nd and 3rd grade students tested in 2010 were followed up when they were in 5th and 6th grade in 2013). These findings indicate that some progress has been made in terms of literacy and numeracy in DEIS primary schools. However, it is not clear which elements of the DEIS programme may have been most significant in bringing about these improvements (Smyth et al., 2015), and more recent studies show that large achievement gaps remain between the average reading and mathematics scores of students in Urban Band 1 schools (the most disadvantaged DEIS urban schools) and those of their counterparts in urban non-DEIS primary schools (Nelis & Gilleece, 2023).

Improvements have also been observed in nationally representative samples of primary school students in national and international studies of achievement (Perkins & Clerkin, 2020; Shiel et al., 2014). In Ireland, significant increases in mathematics and science scores were observed between the 2011 and 2015 cycles of TIMSS, with very little subsequent change from 2015 to 2019 (Clerkin et al., 2016; Eivers & Clerkin, 2012a; Perkins & Clerkin, 2020). The increases in both domains between 2011 and 2015 were driven primarily by substantial improvements in performance among lower-achieving students, with comparatively smaller changes observed among higher-achieving students (Clerkin et al., 2016). The subsequent TIMSS 2019 results showed a slight widening of the tails at both ends of the distribution (Perkins & Clerkin, 2020). This change was not statistically significant but nonetheless provides an indication of increasing differences in performance between the lowest-achieving (at the 5th percentile) and highest-achieving (at the 95th percentile) students in more recent years.

Other research in the Irish context has focused on the periods before and after the introduction of the National Literacy and Numeracy Strategy 2011–2020 (Department of Education and Skills, 2011), following which, substantial improvements were observed in Irish students’ performance in national and international assessments. Analyses of both national (NAMER) and international (TIMSS and PIRLS) assessment data have shown that as average student achievement in mathematics and reading has improved across two cycles, between-school differences and performance gaps between groups of students have become smaller (Karakolidis et al., 2021a, b, c). This demonstrates a trend towards reduced inequality (i.e., improved equality), after the introduction of the Strategy, as performance differences between students and between schools have decreased. Focusing specifically on mathematics, analysis of the results of the 2009 and 2014 national assessments (NAMER) shows more consistent patterns of reduced inequality compared to results from international assessments in a similar period (TIMSS), which are less clear and warrant further examination (Karakolidis et al., 2021c).

The current study

Karakolidis et al. (2021c) noted that the national assessments (NAMER) results show more consistent patterns of reduced inequality than results from international assessments (TIMSS). Therefore, the current study further investigates the TIMSS results by analysing the most recent (2019) TIMSS data along with the previous two cycles (2011 and 2015). Given the recognised economic and policy importance of STEM (Department of Education and Skills, 2019) and in line with similar recent efforts to shed greater light on the teaching and learning of science at primary level in Ireland (Nonte et al., 2022), data for science achievement are given greater attention in the current study, with student achievement in science examined alongside mathematics. Comparisons are drawn between the two subjects.

Using data for 4th grade students from three cycles of TIMSS, this study aims to explore changes in measures of inequality as overall performance increased during the period from 2011 to 2019 in Ireland. The study assesses the pattern and extent of achievement differences in mathematics and science in the context of factors related to socioeconomic status. Using Ferreira and Gignoux’s (2014) framework for conceptualising and measuring educational inequality as a starting point, we offer an enhanced approach to investigating how performance changes over time, and how such changes are distributed across groups of students from various socioeconomic backgrounds. Building on Ferreira and Gignoux’s framework, inequalities are examined at the student and school levels by estimating the variance in mathematics and science achievement attributed to between-school differences via the use of multilevel modelling. The study focuses on factors related to socioeconomic status—namely, home learning resources at the student level and schools’ disadvantaged status at the school level.

This paper aims to answer the following research questions (RQs):

RQ1: Has inequality of achievement in mathematics and science changed over the period from 2011 to 2019 in Ireland?

RQ2: Has inequality of opportunity attributed to socioeconomic factors changed over the period from 2011 to 2019 in Ireland?

Specifically, this paper examines changes in student mean scores in mathematics and science, variability in student performance within and between schools, and the extent to which student scores vary with respect to socioeconomic factors, across cycles and subjects. Findings provide valuable evidence for Ireland on an area of international importance and concern (inequalities of achievement and opportunity related to socioeconomic factors), and they also provide a useful alternative framework for analysing large-scale assessment data in other countries.

Methods

Data

This paper uses data from three cycles of TIMSS—a study of the International Association for the Evaluation of Educational Achievement (IEA) that was first administered in 1995 and has taken place every four years since then. The study involves the administration of curriculum-based mathematics and science assessments to students in selected classes. Questionnaires are also administered to students, teachers, parents/guardians and school principals to gather contextual information. For the current study, data for 4th grade students in Ireland from TIMSS 2011, TIMSS 2015 and TIMSS 2019 were used. In each of these cycles in Ireland, data were gathered using paper-based assessments; more details on the administration of TIMSS in Ireland can be found in the national reports for each cycle (Clerkin et al., 2016; Eivers & Clerkin, 2012b; Perkins & Clerkin, 2020), and further details on the international databases are available from the associated user guides (Fishbein et al., 2021; Foy, 2017; Foy et al., 2013).

Participants

TIMSS has a complex design that involves a two-stage cluster sampling methodology. First, a representative sample of schools is selected using stratified sampling based on probability proportional to size, and then either one or two intact classes within each sampled school are randomly selected to provide a representative sample of student participants (LaRoche et al., 2020). In Ireland, stratifying variables at primary level are school DEIS status, language of instruction and gender mix (Perkins & Clerkin, 2020). In most schools, two classes are selected; however, there are some small schools in which there is only one class at each grade level, and in these schools only one class can be selected for participation. Table 1 presents the achieved sample sizes for each cycle of TIMSS examined in the current study. School and student response rates in each cycle were very high, ranging from 98% (prior to replacement) to 100% at the school level and from 91% to 96% at the student level (Clerkin et al., 2016; Eivers & Clerkin, 2012b; Perkins & Clerkin, 2020).

Table 1 Participating sample sizes in TIMSS 2011, 2015 and 2019

Measures

Mathematics and science achievement are the main outcome measures. As an international assessment involving more than 60 countries, the TIMSS assessment is designed to represent the curricular content of most participating countries in the chosen domains (mathematics and science) (Mullis & Martin, 2017). This means that the assessment content will rarely match any individual country’s curriculum perfectly. However, a test-curriculum matching analysis confirmed that the majority of mathematics and science items in the TIMSS assessment were judged by expert raters to have been covered by 4th grade students in Ireland by the time of testing. This analysis also confirmed that the estimates of student achievement are largely unaffected by the selection of items (Clerkin et al., 2016; Perkins & Clerkin, 2020).

In TIMSS, student performance in each subject is reported with reference to an item response theory (IRT)-based scale with an international centrepoint of 500 and a standard deviation of 100. This scale was established in the first cycle of TIMSS in 1995, and subsequent assessments have been linked to the initial scale to allow for comparisons of performance across cycles and over time (Foy et al., 2020). Each student is assigned five plausible value achievement estimates for each assessment domain and subdomain. TIMSS questionnaires are administered in conjunction with the tests to gather contextual information from students, parents/guardians, teachers, and school principals. These data can be linked to students’ achievement data.

At the student level, the Home resources for learning index (available in the international TIMSS database) was used in this study as a proxy for socioeconomic status. The home resources for learning index is constructed from a combination of data from questionnaires completed by students and their parents/guardians. Students are asked to provide information about the number of books in their homes and other study supports (an internet connection and having their own room to study in). Parents are asked to provide information about the number of children’s books at home, the parents’ own levels of education, and the parents’ occupations. Using the measures of availability of these five home resources for learning, students are assigned a score on the scale. The TIMSS context questionnaire scale was established in 2011 based on the combined response distribution of all countries that participated in that cycle. In order to provide a point of reference that would facilitate comparisons across countries, the scale centrepoint of 10 was situated at the mean of the combined distribution, and the scale units were selected so that two scale score-points corresponded to the standard deviation of the distribution (see Note 3) (Mullis et al., 2020).
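To make this reporting metric concrete, the following sketch illustrates how the centrepoint of 10 and the two-points-per-standard-deviation unit relate a standardised position in the combined 2011 distribution to a reported score. It is an illustration only, with a hypothetical function name; the published index is derived via IRT scaling of the five component items, not via this simple linear transformation.

```python
def hrl_scale_score(z_relative_to_2011: float) -> float:
    """Map a standardised position (z-score) relative to the combined 2011
    international distribution onto the TIMSS context-scale metric:
    centrepoint 10, with two scale points corresponding to one standard
    deviation. Illustration of the metric only, not the IRT scaling itself."""
    return 10.0 + 2.0 * z_relative_to_2011


# Example: a student one standard deviation above the combined 2011 mean
# would be reported at about 12 scale points.
print(hrl_scale_score(1.0))  # 12.0
```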

Table 2 presents the means, standard deviations and associated standard errors of the home resources for learning index for Ireland, across cycles. In 2015, students’ mean score on the home resources for learning index was statistically significantly higher than in 2011. The difference in mean home resources for learning between 2015 and 2019 was not statistically significant and the 2019 value was also significantly higher than the corresponding value in 2011. Similar patterns are observed in the aggregated school mean home resources for learning, across cycles (See Additional file 1: Table S1).

Table 2 Means, standard deviations and standard errors of the home resources for learning index

At the school level, school DEIS status was used as a proxy for the socioeconomic profile of students in the school. It is worth emphasising that DEIS schools in TIMSS 2011 and 2015 were identified using the original identification procedure based on principals’ reports of school socioeconomic profile (Archer & Sofroniou, 2008). DEIS schools in TIMSS 2019 were identified either using the original identification process or the 2017 updated process so, for DEIS schools in TIMSS 2019, the duration of additional supports can vary according to whether a school was part of the original tranche of DEIS schools or the newer 2017 group. Because of the small numbers of schools sampled in each DEIS category and the very large standard errors associated with their estimates, DEIS is examined as a binary variable in the current study (i.e., Urban Band 1 schools, Urban Band 2 schools and DEIS Rural schools are grouped, so that schools are classified as either DEIS or non-DEIS). Table 3 presents this binary variable—school DEIS status—and the percentages of students belonging to each category in these three cycles of TIMSS. In each cycle, the percentage of students in DEIS schools in TIMSS (18–19%) is very similar to the percentage of primary students in the population (20%) in DEIS schools (Department of Education, 2022).

Table 3 Percentages of students participating in TIMSS attending disadvantaged (DEIS) and non-disadvantaged (non-DEIS) schools

Table 4 presents mean home resources for learning for students attending disadvantaged (DEIS) and non-disadvantaged (non-DEIS) schools. As expected, students in DEIS schools have statistically significantly fewer home resources for learning on average than students attending non-DEIS schools.

Table 4 Means, standard deviations and standard errors of the home resources for learning index, by school disadvantage (DEIS) status

Analysis

To examine inequalities in mathematics and science performance, a series of univariate, bivariate and multilevel linear regression analyses were conducted. Patterns of student mean performance over time, as well as the variability around mean scores, were examined across cycles; a smaller standard deviation would imply less variance in students’ scores and therefore greater equality of achievement, whereas a larger standard deviation would suggest more variance and greater inequality. Inequality of achievement was also examined at the school level using multilevel linear regression analysis, with students at level one and schools at level two, by estimating the intraclass correlation coefficient (ICC; see Eq. 1). The greater the variability among schools relative to the total variability in achievement (i.e., the ICC), the greater the inequality of achievement.

$$\mathrm{ICC} = \frac{\text{variance among schools}}{\text{total variance}} \qquad \text{(1)}$$

where total variance equals between-student variance plus between-school variance.
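As a minimal computational sketch of Eq. 1, the ICC can be obtained from a null (intercept-only) two-level model fitted with a generic mixed-model routine. The data frame and column names (df, math_pv1, school_id) are hypothetical, and, unlike the analyses reported here (which were run in Mplus with sampling weights and all five plausible values), the sketch ignores weighting and uses a single plausible value.

```python
import pandas as pd
import statsmodels.formula.api as smf


def null_model_icc(df: pd.DataFrame, outcome: str, school_col: str) -> float:
    """Fit a null (intercept-only) two-level model with a random intercept for
    schools and return the intraclass correlation coefficient (Eq. 1)."""
    model = smf.mixedlm(f"{outcome} ~ 1", data=df, groups=df[school_col])
    result = model.fit(reml=True)
    between_school_var = float(result.cov_re.iloc[0, 0])  # variance among schools
    within_school_var = float(result.scale)               # between-student (residual) variance
    return between_school_var / (between_school_var + within_school_var)


# Hypothetical usage with one plausible value of mathematics achievement:
# icc_math = null_model_icc(df, outcome="math_pv1", school_col="school_id")
```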

Measurement of inequality of opportunity requires some quantification of the relationship between demographic or socioeconomic characteristics and students’ academic performance. Bivariate analyses were performed to identify the extent to which the examined explanatory variables were related to student performance in mathematics and science over time. Hedges’ g effect sizes were computed to show the magnitude of the performance gaps, with higher effect sizes indicating higher levels of inequality of opportunity. Finally, multilevel linear regression analysis was performed to examine the extent to which home resources for learning and school disadvantaged (DEIS) status contributed to the explanation of students’ performance, independently of each other, and to estimate the variance in achievement explained by the selected socioeconomic factors and compare differences across the three TIMSS cycles. If a large proportion of variance in achievement were explained by socioeconomic indicators, this would indicate high levels of inequality of opportunity, since it suggests that students’ achievement is strongly related to their socioeconomic characteristics. Conversely, if only a small proportion of variance in achievement is explained by socioeconomic indicators, this suggests greater equality of opportunity, as students’ performance outcomes are not as strongly linked to these background factors.
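As an illustration of the effect-size measure used in the bivariate comparisons, the sketch below computes Hedges’ g for the gap between two groups of students (for example, those in non-DEIS versus DEIS schools). It assumes simple unweighted samples of scores; the analyses reported here additionally involve the TIMSS sampling weights, replicate weights and plausible values.

```python
import numpy as np


def hedges_g(group_a: np.ndarray, group_b: np.ndarray) -> float:
    """Hedges' g: the standardised mean difference (mean gap divided by the
    pooled standard deviation) multiplied by a small-sample correction factor."""
    n_a, n_b = len(group_a), len(group_b)
    pooled_sd = np.sqrt(((n_a - 1) * group_a.var(ddof=1) +
                         (n_b - 1) * group_b.var(ddof=1)) / (n_a + n_b - 2))
    d = (group_a.mean() - group_b.mean()) / pooled_sd
    correction = 1.0 - 3.0 / (4.0 * (n_a + n_b) - 9.0)
    return d * correction
```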

All five plausible values in the TIMSS datasets were taken into account in all analyses. The total student weights along with the replicate weights were used in the univariate and bivariate analyses. In the multilevel analysis, sampling weights were used at both levels. At level one, the total student weights were scaled to add up to the school sample size, while at level two, the final school weights were used (Karakolidis et al., 2022; Mang et al., 2021). Replicate weights were not incorporated into the multilevel analysis. The use of replicate weights in the univariate and bivariate analyses and the application of multilevel modelling allowed us to account for the clustered nature of the TIMSS sample (Foy & LaRoche, 2020; Woltman et al., 2012). The sampling design of such studies should not be ignored in the analysis as students within the selected clusters (classes and schools) may be more similar to each other than they are to students in the target population in general. This lack of independence of the analysis units can lead to underestimation of standard errors, smaller p-values and subsequently increased risk of a Type I error (Field, 2017). The Full Information Maximum Likelihood method was used to allow the inclusion of the few cases with missing home resources for learning data in the multilevel models.
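The combination of results across the five plausible values follows standard multiple-imputation (Rubin) combination rules, which tools such as the IDB Analyzer apply automatically. The sketch below shows only that combination step, under the assumption that the per-plausible-value estimates and their sampling variances (in TIMSS, obtained via jackknife repeated replication using the replicate weights) have already been computed.

```python
import numpy as np


def combine_plausible_values(estimates, sampling_variances):
    """Combine a statistic computed separately on each plausible value.

    The point estimate is the average across plausible values; the total error
    variance is the average sampling variance plus the between-plausible-value
    (imputation) variance inflated by (1 + 1/M). Returns the combined estimate
    and its standard error.
    """
    estimates = np.asarray(estimates, dtype=float)
    sampling_variances = np.asarray(sampling_variances, dtype=float)
    m = len(estimates)
    point_estimate = estimates.mean()
    imputation_variance = estimates.var(ddof=1)
    total_variance = sampling_variances.mean() + (1.0 + 1.0 / m) * imputation_variance
    return point_estimate, float(np.sqrt(total_variance))
```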

The univariate and bivariate analyses were performed using the IDB Analyzer 5 (IEA, 2022) software package. The multilevel analyses were conducted using Mplus 8 (Muthén & Muthén, 2017).

Results

Inequality of achievement—overall performance and variability in scores (RQ1)

As shown in Table 5, there were statistically significant improvements between 2011 and 2015 in the average mathematics and science performance of 4th grade students. Specifically, students’ performance increased by 20 score-points in mathematics and 13 score-points in science. Overall performance in both subjects remained stable between 2015 and 2019.

Table 5 Average performance and standard deviation in mathematics and science on TIMSS 2011, 2015 and 2019

Alongside the considerable increase in overall student performance in 2015, there was a statistically significant reduction in standard deviations around the mean scores in mathematics and science. This can be considered evidence of decreased inequality (or increased equality) of achievement, as the variability of mathematics and science scores around the mean is smaller and students appear to perform more similarly to each other.

Although performance remained relatively stable in TIMSS 2019 compared with TIMSS 2015, the variation around the mean score increased. This increase in standard deviation between 2015 and 2019 was statistically significant in science, but not in mathematics.

To further examine the variability of student mathematics and science scores within and between schools, multilevel regression analysis of student scores was conducted. As indicated by the relevant ICCs in the null models (without any explanatory variables), there was a substantial decrease between 2011 and 2015 in the variance attributed to differences between schools, in both mathematics and science (Table 6). Specifically, the between-school variance decreased by 42.4% in mathematics (from 17.7% in 2011 to 10.2% in 2015) and by 34.7% in science (from 21.3% in 2011 to 13.9% in 2015). The reductions in the ICCs indicate that performance in TIMSS 2015 was less “dependent” on the school that students attended, which can be interpreted as evidence of reduced inequality (or increased equality) among students and schools.

Table 6 Percent of between-school variance in mathematics and science scores

However, this was not the case in 2019, when there was a slight increase relative to 2015 in the proportion of the variance in mathematics and science performance attributed to between-school differences. The increase in ICC between 2015 and 2019 was larger for science than for mathematics, although it was very small in both cases.

The between-school variation in home resources for learning remained relatively stable across cycles (See Additional file 1: Table S2).

Inequality of opportunity—variance explained by socioeconomic factors (RQ2)

Table 7 presents the relationship between student performance in mathematics and science and the student-level socioeconomic indicator—home resources for learning—in the last three cycles of TIMSS. In mathematics, despite a small increase between 2011 and 2015, the strength of this relationship was very similar across all three cycles. The relationship between science performance and home resources for learning remained unchanged over the years.

Table 7 Correlation between home resources for learning and performance in mathematics and science

The results of the school-level analyses show a different pattern from the analysis at the student level. Table 8 shows the differences in students’ performance in mathematics and science in TIMSS 2011, 2015 and 2019 based on the school-level socioeconomic indicator (school disadvantaged status). Overall, students in disadvantaged (DEIS) schools were outperformed by their peers in non-disadvantaged (non-DEIS) schools in both mathematics and science in all three TIMSS cycles. The performance gap between students in DEIS and non-DEIS schools remained statistically significant in all three cycles. The magnitude of these gaps tended to become smaller in the more recent cycles, as indicated by the relevant effect sizes (Hedges’ g). However, the reduction of the difference in achievement across cycles was not statistically significant at any of the points of comparison.

Table 8 Performance differences in mathematics and science in TIMSS 2011, 2015 and 2019, by schools’ disadvantaged (DEIS) status

Multilevel linear regression

Multilevel linear regression analysis was conducted to examine how each variable (home resources for learning and DEIS status) contributes to the explanation of variance in mathematics and science performance when the other variable is taken into account (Tables 9 and 10). The multilevel analysis tables present the unstandardized coefficients (B), which are based on scaled scores, along with their respective standard errors (SE) for each of the examined variables, as well as the variance in achievement explained by the model at the student level, the school level and overall.

Table 9 Multilevel modelling of mathematics achievement
Table 10 Multilevel modelling of science achievement

Both home resources for learning and DEIS status were statistically significant predictors of mathematics and science performance in all six models (one per TIMSS cycle, per domain). After accounting for DEIS status, the relationship between home resources for learning and mathematics and science performance was stronger in the more recent cycles of TIMSS, compared with the earliest cycle in 2011. For instance, having controlled for DEIS status, a one-point increase in a student’s home resources for learning index would be expected to lead to an increase of 16.83 score-points in their mathematics score in TIMSS 2011, while the same increase in a student’s home resources for learning in TIMSS 2015 and 2019 would be associated with increases of 17.89 and 18.66 score-points, respectively. These increases in the magnitude of the coefficients, however, are not statistically significant in either mathematics or science (see Note 4).

On the other hand, even after accounting for student home resources for learning, the gap in achievement between DEIS and non-DEIS schools decreased across cycles. For example, the predicted science performance gap between students in DEIS and non-DEIS schools decreased from 27.10 score-points in 2011 to 15.49 in 2019, a decrease of almost 12 score-points. A similar pattern was observed in mathematics. However, although the performance gap between DEIS and non-DEIS schools, after accounting for home resources for learning, decreased substantively over time, the change was not statistically significant (which can be attributed to the large standard errors around the coefficients of this school-level factor).

In science, the combined explanatory power of these two variables increased over time from 16.8% in 2011 to 20.2% in 2019, indicating an increase in inequalities. In mathematics, the changes across cycles are smaller; the overall explained variance increased from 16.3% in 2011 to 18.7% in 2015, and then decreased to 17.6% in 2019.

Regression slopes were allowed to vary across schools. However, this did not significantly improve the fit of any of the mathematics or science models. This indicates that the positive relationship between home resources for learning and performance in mathematics and science does not appear to vary significantly across schools, within each cycle of TIMSS. Cross-level interactions between home learning resources and schools’ disadvantaged (DEIS) status were also examined for each cycle, but none reached statistical significance.
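The following is a conceptual sketch (not a reproduction of the Mplus models used in this study) of how the random slope and the cross-level interaction described above could be specified with a generic two-level mixed-model routine. The column names (math_pv1, hrl, deis, school_id) are hypothetical, and sampling weights, replicate weights and the combination across plausible values are omitted.

```python
import pandas as pd
import statsmodels.formula.api as smf


def fit_model_extensions(df: pd.DataFrame):
    """Fit the baseline two-level model and the two extensions discussed above,
    using maximum likelihood so that model fits can be compared."""
    # Baseline: fixed effects for home resources (hrl) and DEIS status,
    # with a random intercept for schools.
    baseline = smf.mixedlm("math_pv1 ~ hrl + deis", data=df,
                           groups=df["school_id"]).fit(reml=False)

    # Extension 1: allow the home-resources slope to vary across schools.
    random_slope = smf.mixedlm("math_pv1 ~ hrl + deis", data=df,
                               groups=df["school_id"],
                               re_formula="~hrl").fit(reml=False)

    # Extension 2: cross-level interaction between home resources and DEIS status.
    interaction = smf.mixedlm("math_pv1 ~ hrl * deis", data=df,
                              groups=df["school_id"]).fit(reml=False)

    # Comparing baseline vs. random_slope (e.g., via a likelihood-ratio test)
    # and inspecting the interaction coefficient would indicate whether either
    # extension improves on the baseline; in the analyses reported above, neither did.
    return baseline, random_slope, interaction
```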

Summary of analyses

Figures 1 and 2 summarise the main results of the analyses for mathematics and science, respectively. After a significant increase in students’ average mathematics and science scores in 2015 (grey bars), which was accompanied by a notable decrease in standard deviations, average performance in 2019 remained relatively stable while standard deviations increased slightly (significantly so in science, but not in mathematics).

Fig. 1 Mean scores, standard deviations, ICCs and explained variances across the three TIMSS cycles, mathematics

Fig. 2 Mean scores, standard deviations, ICCs and explained variances across the three TIMSS cycles, science

Changes in ICC over time (green line) indicate changes in inequality of achievement, while changes in variance explained over time (blue, red and yellow lines) indicate changes in inequality of opportunity. As the lines in the figures move down, inequality decreases (or equality increases); as they move up, inequality increases (or equality decreases). Across both mathematics (Fig. 1) and science (Fig. 2), the total variance explained and the student-level variance explained appear to increase over time, particularly between 2011 and 2015, indicating increasing inequality of opportunity. At the school level, however, inequality appears to decrease over time.

These findings are generally consistent for both mathematics and science.

Discussion and conclusions

The analyses presented above provide a nuanced picture of changes in educational inequality in Ireland over the last decade. From the perspective of inequality of educational achievement (variation in observed student performance on a measure of achievement), these analyses show that inequality of mathematics and science achievement declined considerably between 2011 and 2015, in conjunction with significant increases in overall performance. This echoes the findings of Karakolidis et al. (2021c) whose study examined changes in equality of mathematics achievement and subgroup performance differences in Irish primary school students over time using both national (NAMER) and international (TIMSS) assessment data.

The decrease in inequality of mathematics achievement between TIMSS 2011 and 2015 is also consistent with results of similar analysis which focused only on data from NAMER 2009 and 2014 (Karakolidis et al., 2021b). This study found that the magnitude of performance gaps in mathematics and reading between students in different subgroups (e.g., those in DEIS and non-DEIS schools) shrank over time, while overall performance increased. The improvements in achievement observed in NAMER 2014 particularly favoured groups of students who had had lower performance than their counterparts in 2009, leading to smaller performance gaps in 2014 and indicating increased equality of achievement (Karakolidis et al., 2021a, b).

In the current study, a small increase in inequality of achievement in mathematics and a larger increase in inequality of achievement in science were observed between 2015 and 2019, although average performance remained stable in both domains during that period.

It should be noted that an increase in inequality of educational achievement does not necessarily indicate a worsening of student outcomes. In fact, in some circumstances, it could be consistent with generally positive changes; for example, if the number of lower-achieving students increased slightly but the numbers of higher-achieving students increased by a larger amount over the same period, the variation in performance across the distribution would increase (indicating greater inequality of achievement) despite improvements in both groups’ performance. In the specific case of changes in Ireland between TIMSS 2015 and TIMSS 2019, the small increase in the standard deviation in mathematics can be attributed to the small increase in the percentages of high achievers (those performing at the advanced international benchmark), while the percentages of low achievers (those performing at the lowest international benchmark) remained stable. However, in science, the widening of the distribution of achievement was more apparent as the percentages of students categorised at both the highest and lowest levels increased (Perkins & Clerkin, 2020). The increase in variation also reflects a widening of the distribution of mathematics and science achievement at both tails (e.g., the performance gap between students performing at the 5th and the 95th percentile) (Perkins & Clerkin, 2020). These changes, although small in magnitude, are of more concern.

Recent findings have highlighted the relative underperformance of high-achieving students in Ireland compared with other countries in both mathematics and science (McKeown et al., 2019; Perkins & Clerkin, 2020; Pitsia et al., 2019). This suggests a need for renewed policy focus on supporting higher-achieving students to reach their full potential. In so doing, however, it will be important to maintain a parallel focus on provision of support and resources to lower-achieving students and those at risk of disadvantage with the aim of increasing performance across the distribution of achievement, not only among those on one side of it.

From the perspective of inequality of educational opportunity (the extent to which educational outcomes can be predicted by demographic or socioeconomic factors), in general, the results of the current study demonstrate increases in inequality of opportunity in Ireland across the three cycles of TIMSS from 2011 to 2019. In other words, the proportion of variance in student achievement that can be explained by a combination of students’ home resources for learning and their schools’ disadvantaged (DEIS) status has increased over time. The exception is TIMSS 2015 mathematics, which shows a more inconsistent pattern; the overall variance explained by DEIS status and students’ home resources for learning in 2015 was higher than that observed in both TIMSS 2011 and 2019.

This increase in inequality of opportunity appears to be largely driven by a strengthening relationship between achievement and home resources for learning from 2011 to 2019, after accounting for schools’ DEIS status and the clustered nature of the data. Over the same period, the proportion of variance explained by schools’ disadvantaged (DEIS) status decreased and, indeed, the observed difference in average achievement between disadvantaged and non-disadvantaged schools has been reduced in both science and mathematics. This finding is consistent with results of analyses of NAMER mathematics data (Karakolidis et al., 2021b), in which the magnitude of the performance gap between students in disadvantaged and non-disadvantaged schools decreased from 2009 to 2014. This may indicate that the implementation of the DEIS programme has achieved some success in reducing educational inequalities at the school level (although, notably, significant differences in educational outcomes remain). However, this reduction of inequality at the school level has further highlighted a persistent underlying inequality of resources at the individual student/household level and its association with academic performance.

The increase in inequality of opportunity noted in the current study seems to conflict with the general trend observed by Karakolidis et al. (2021b) in their analysis of national assessment (NAMER) data. Their models of mathematics achievement showed that the variance explained by student and school factors in 2014 was about half the magnitude of the variance explained in 2009. This suggests that students’ mathematics performance was less strongly related to the examined background characteristics (e.g., DEIS status, number of books in students’ homes, parent education) in 2014, compared with 2009, which is an indicator of reduced inequality of opportunity over that period. The differences between the patterns observed by Karakolidis et al. (2021b) and those observed in the current study demonstrate that different findings may emerge from national and international assessments. These differences also highlight the importance of examining changes over time from multiple perspectives and sources of evidence.

This paper offers an enhanced approach to analysing large-scale assessment results that goes beyond the comparison of mean scores across cycles. This approach allows for the use of international assessment data to examine educational inequality in a national context, which may be particularly useful for countries whose national assessments do not provide data on educational inequality, or for countries which may have some national data but lack representative national data for a particular domain, as is the case for science in Ireland.

Limitations and future research

The analyses in the current study focus on variability in student performance, subgroup differences and the predictive power of socioeconomic factors, taking the point of reference as the previous assessment cycle rather than a minimal achievement standard. Overall performance standards are a relevant consideration within the discussion of equality, and this point is discussed in Espinoza’s equality-equity framework as “equity for equal needs” at the output stage, which relates to the provision of sufficient resources to ensure that every student reaches “a minimal needed achievement level (the minimum achievement definition: Gordon, 1972) and that differences in achievement beyond that are based on need” (Espinoza, 2007, p. 353). This issue could be further investigated by examining the proportions of students from more socioeconomically disadvantaged groups (those groups for which inequalities seem to have increased) who are achieving at low levels. Indeed, further examinations of patterns of achievement with respect to socioeconomic factors might provide more insight into the distribution of educational inequalities. Furthermore, in 2021, another cycle of NAMER was conducted. Future research could usefully focus on how the pattern of equality/inequality of achievement in performance on NAMER has developed since 2014, and how this compares to the results of TIMSS 2019 described in the current study. Likewise, at the time of writing, TIMSS 2023 has just been conducted, and data from this cycle may be used in future research to investigate any changes or development in patterns of inequalities.

The current study examines inequality of achievement and opportunity based on schools’ disadvantaged (DEIS) status and students’ home resources for learning but, of course, these are not the only factors related to socioeconomic status and achievement. Future research could draw on the work of Agasisti et al. (2021), which examines the role of school climate in patterns of achievement in the context of socioeconomic factors.

One methodological limitation of this paper relates to the fact that the meaning of the DEIS variable has changed over time due to the revision and refinement of the model that identifies schools for inclusion in the DEIS programme. With very limited exception (e.g., in the context of permanent school closures), schools that entered the DEIS programme originally have remained in the programme. Subsequent focus was on expanding the scheme, with less attention given to whether schools should remain in the programme indefinitely once admitted (Harford & Fleming, 2022). The refinement of the model in 2017 has meant that the characteristics of some DEIS schools that participated in TIMSS 2019 may differ slightly from those included in previous TIMSS cycles. The 2019 sample contained some schools that had been categorised as disadvantaged (DEIS) for less than two years, and these were treated as equivalent to schools that may have been designated as DEIS as early as 2005. The DEIS variable is valuable for its strong policy relevance and high levels of public awareness. However, its changing meaning complicates cross-cycle comparisons such as those presented in this study.

This points to a broader challenge of policy monitoring and evaluation (European Commission, 2022). As demonstrated in the results from the current study, two measures can provide very different perspectives on the same constructs. Looking at school-level disadvantage, the association between school DEIS status and achievement appears to decrease over time, suggesting greater equality of opportunity in one respect, while the association between student-level home resources for learning and achievement appears to increase, suggesting greater inequality of opportunity in another.

Although the home resources for learning index may be more robust than the DEIS variable from a methodological perspective, it does not carry the same meaning and policy relevance, for an Irish audience at least. The importance of examining issues using multiple sources of evidence is underlined by each variable’s particular merits, and indeed by the differing perspectives they offer which, when combined, can offer a more nuanced understanding.

Availability of data and materials

The data from TIMSS 2011, TIMSS 2015 and TIMSS 2019 used in this research paper (with the exception of the school disadvantaged status [DEIS] variable which is confidential sampling information and not published) are available from https://timssandpirls.bc.edu/timss2011/international-database.html, https://timssandpirls.bc.edu/timss2015/international-database/, and https://timss2019.org/international-database/, respectively.

Notes

  1. “Deis” is the Irish language word for “opportunity”.

  2. 79 schools across primary and post-primary levels, with a further 30 primary schools re-classified from Urban Band 2 to Band 1.

  3. Approximately 8% of cases in the Home resources for learning variable were missing in each TIMSS cycle.

  4. The statistical significance of the changes in the regression coefficients across cycles was tested using the z distribution (Clogg et al., 1995): \(Z = \frac{b_{1} - b_{2}}{\sqrt{SE_{b_{1}}^{2} + SE_{b_{2}}^{2}}}\)
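A minimal sketch of this test, assuming the coefficients and their standard errors from two cycles have already been extracted:

```python
from math import sqrt

from scipy.stats import norm


def coefficient_change_z(b1: float, se1: float, b2: float, se2: float):
    """z statistic for the difference between two independently estimated
    regression coefficients (Clogg et al., 1995), with a two-sided p-value."""
    z = (b1 - b2) / sqrt(se1 ** 2 + se2 ** 2)
    p_value = 2.0 * (1.0 - norm.cdf(abs(z)))
    return z, p_value
```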

Abbreviations

TIMSS: Trends in International Mathematics and Science Study
DEIS: Delivering Equality of Opportunity in Schools
PISA: Programme for International Student Assessment
OECD: Organisation for Economic Co-operation and Development
UNESCO: United Nations Educational, Scientific and Cultural Organisation
EU: European Union
NAMER: National Assessments of Mathematics and English Reading
STEM: Science, Technology, Engineering and Mathematics
RQ: Research Question
IEA: International Association for the Evaluation of Educational Achievement
IRT: Item response theory
IBM: International Business Machines Corporation
IDB: International Data Base
SD: Standard deviation
SE: Standard error
ICC: Intraclass correlation coefficient

References

  • Agasisti, T., Avvisati, F., Borgonovi, F., & Longobardi, S. (2021). What school factors are associated with the success of socio-economically disadvantaged students? An empirical investigation using PISA data. Social Indicators Research, 157(2), 749–781. https://doi.org/10.1007/s11205-021-02668-w

  • Archer, P., & Sofroniou, N. (2008). The assessment of levels of disadvantage in primary schools for DEIS. Educational Research Centre. https://www.erc.ie/documents/deis_assess_disadv_prim_sch.pdf. Accessed 28 Nov 2023.

  • Chzhen, Y., Rees, G., Gromada, A., Cuesta, J., & Bruckauf, Z. (2018). Innocenti report card 15. In M. Drohan (Ed.), An unfair start: Inequality in children’s education in rich countries. United Nations Children’s Fund (UNICEF).

  • Clerkin, A., Perkins, R., & Cunningham, R. (2016). TIMSS 2015 in Ireland: Mathematics and science in primary and post-primary schools. Educational Research Centre. https://www.erc.ie/wp-content/uploads/2016/11/TIMSS-initial-report-FINAL.pdf. Accessed 28 Nov 2023.

  • Clogg, C. C., Petkova, E., & Haritou, A. (1995). Statistical methods for comparing regression coefficients between models. American Journal of Sociology, 100(5), 1261–1293. https://doi.org/10.1086/230638

  • Coleman, J. S., Campbell, E., Hobson, C., McPartland, J., Mood, A., Weinfeld, F., & York, R. (1966). Equality of educational opportunity. Department of Health, Education and Welfare.

  • Department of Education. (2022). The refined DEIS identification model. Author. https://www.gov.ie/pdf/?file=https://assets.gov.ie/220043/d6b98002-a904-427f-b48a-0fa0af756ea7.pdf#page=null. Accessed 28 Nov 2023.

  • Department of Education and Science. (2005). DEIS (Delivering equality of opportunity in schools): An action plan for educational inclusion. Author. http://edepositireland.ie/bitstream/handle/2262/89931/deis_action_plan_on_educational_inclusion.pdf?sequence=2&isAllowed=y. Accessed 28 Nov 2023.

  • Department of Education and Skills. (2011). Literacy and Numeracy for Learning and Life: The National Strategy to Improve Literacy and Numeracy among Children and Young People 2011–2020. Author. https://assets.gov.ie/24521/9e0e6e3887454197a1da1f9736c01557.pdf. Accessed 28 Nov 2023.

  • Department of Education and Skills. (2017). DEIS plan 2017: Delivering equality of opportunity in schools. Author. https://www.gov.ie/pdf/?file=https://assets.gov.ie/24451/ba1553e873864a559266d344b4c78660.pdf#page=null. Accessed 28 Nov 2023.

  • Department of Education and Skills. (2019). STEM Education Policy Statement 2017–2026. Author. https://www.gov.ie/pdf/?file=https://assets.gov.ie/43627/06a5face02ae4ecd921334833a4687ac.pdf#page=null. Accessed 28 Nov 2023.

  • Dronkers, J., & de Heus, M. (2012). Immigrants’ children scientific performance in a double comparative design: the influence of origin, destination, and community. Discussion Paper Series CDP No 13/12. Centre for Research and Analysis of Migration (CReAM)

  • Eivers, E., & Clerkin, A. (2012a). PIRLS & TIMSS 2011: Reading, mathematics and science outcomes for Ireland. Educational Research Centre. https://assets.gov.ie/24993/5fc773438c374aaf9108805ec229d972.pdf. Accessed 28 Nov 2023.

  • Eivers, E., & Clerkin, A. (2012b). PIRLS and TIMSS 2011: Technical Report for Ireland. Educational Research Centre. https://www.erc.ie/documents/pt_2011_technical_report.pdf. Accessed 28 Nov 2023.

  • Espinoza, O. (2007). Solving the equity–equality conceptual dilemma: A new model for analysis of the educational process. Educational Research, 49(4), 343–363. https://doi.org/10.1080/00131880701717198

  • European Commission. (2019). PISA 2018 and the EU: Striving for social fairness through education. Publications Office of the European Union.

  • European Commission. (2022). Investing in our future: Quality investment in education and training. Publications Office of the European Union. https://doi.org/10.2766/45896

  • Ferreira, F. H. G., & Gignoux, J. (2014). The measurement of educational inequality: Achievement and opportunity. World Bank Economic Review, 28(2), 210–246. https://doi.org/10.1093/wber/lht004

  • Field, A. (2017). Discovering statistics using IBM SPSS statistics (5th ed.). SAGE.

  • Fishbein, B., Foy, P., & Yin, L. (2021). TIMSS 2019 user guide for the international database (2nd ed.). TIMSS & PIRLS International Study Center, Lynch School of Education, Boston College and International Association for the Evaluation of Educational Achievement (IEA).

  • Flannery, D., Gilleece, L., & Clavel, J. G. (2023). School socio-economic context and student achievement in Ireland: An unconditional quantile regression analysis using PISA 2018 data. Large-Scale Assessments in Education, 11, 19. https://doi.org/10.1186/s40536-023-00171-x

  • Foy, P. (2017). TIMSS 2015 user guide for the international database. TIMSS & PIRLS International Study Center, Lynch School of Education, Boston College and International Association for the Evaluation of Educational Achievement (IEA).

  • Foy, P., Arora, A., & Stanco, G. M. (2013). TIMSS 2011 user guide for the international database. TIMSS & PIRLS International Study Center, Lynch School of Education, Boston College and International Association for the Evaluation of Educational Achievement (IEA).

  • Foy, P., Fishbein, B., von Davier, M., & Yin, L. (2020). Implementing the TIMSS 2019 scaling methodology. In M. O. Martin, M. von Davier, & I. V. S. Mullis (Eds.), Methods and Procedures: TIMSS 2019 Technical Report. TIMSS & PIRLS International Study Center, Lynch School of Education and Human Development, Boston College and International Association for the Evaluation of Educational Achievement (IEA).

  • Foy, P., & LaRoche, S. (2020). Estimating standard errors in the TIMSS 2019 results. In M. O. Martin, M. von Davier, & I. V. S. Mullis (Eds.), Methods and Procedures: TIMSS 2019 Technical Report. TIMSS & PIRLS International Study Center, Lynch School of Education and Human Development, Boston College and International Association for the Evaluation of Educational Achievement (IEA).

  • Gordon, E. W. (1972). Toward defining equality of educational opportunity. In F. Mosteller & D. Moynihan (Eds.), On Equality of Educational Opportunity (pp. 423–434). Random House.

  • Government of Ireland. (1998). Education Act, 1998. https://www.irishstatutebook.ie/eli/1998/act/51/enacted/en/pdf. Accessed 28 Nov 2023.

  • Harford, J., & Fleming, B. (2022). Education in Ireland still shaped by social class despite decades of investment. Opinion: Measures to tackle disadvantage grossly underfunded relative to challenges. The Irish Times. https://www.irishtimes.com/news/education/education-in-ireland-still-shaped-by-social-class-despite-decades-of-investment-1.4827899. Accessed 28 Nov 2023.

  • Hepworth, N., Galvis, M., Gambhir, G., & Sizmur, J. (2021). Using PISA 2018 to inform policy: Learning from the Republic of Ireland Research Brief. National Foundation for Education Research.

  • Hopfenbeck, T. N., Lenkeit, J., El Masri, Y., Cantrell, K., Ryan, J., & Baird, J.-A. (2018). Lessons learned from PISA: A systematic review of peer-reviewed articles on the Programme for International Student Assessment. Scandinavian Journal of Educational Research, 62(3), 333–353. https://doi.org/10.1080/00313831.2016.1258726

  • IEA. (2022). Help manual for the IEA IDB analyzer (Version 5.0). https://www.iea.nl. Accessed 28 Nov 2023.

  • Karakolidis, A., Duggan, A., Kiniry, J., & Shiel, G. (2021a). Who benefits from improved outcomes in reading literacy in Ireland? An investigation of equality using national and international assessment data. In 22nd Association for Educational Assessment (AEA)—Europe Annual Conference.

  • Karakolidis, A., Duggan, A., Shiel, G., & Kiniry, J. (2021b). Educational inequality in primary schools in Ireland in the early years of the National Literacy and Numeracy Strategy: An analysis of National Assessment data. Irish Journal of Education, 44(1), 1–24.

  • Karakolidis, A., Duggan, A., Shiel, G., & Kiniry, J. (2021c). Examining educational inequalities: Insights in the context of improved mathematics performance on national and international assessments at primary level in Ireland. Large-Scale Assessments in Education, 9, 5. https://doi.org/10.1186/s40536-021-00098-1

  • Karakolidis, A., Pitsia, V., & Cosgrove, J. (2022). Multilevel modelling of international large-scale assessment data. In M. S. Khine (Ed.), Methodology for multilevel modeling in educational research (pp. 141–159). Springer. https://doi.org/10.1007/978-981-16-9142-3_8

  • LaRoche, S., Joncas, M., & Foy, P. (2020). Sample design in TIMSS 2019. In M. O. Martin, M. von Davier, & I. V. S. Mullis (Eds.), Methods and Procedures: TIMSS 2019 Technical Report (2nd ed.). TIMSS & PIRLS International Study Center, Lynch School of Education and Human Development, Boston College and International Association for the Evaluation of Educational Achievement (IEA).

  • Looney, A., O’Leary, M., Scully, D., & Shiel, G. (2022). Cross-national achievement surveys and educational monitoring in Ireland. In L. Volante, S. V. Schnepf, & D. A. Klinger (Eds.), Cross-national achievement surveys for monitoring educational outcomes. Publications Office of the European Union. https://doi.org/10.2760/59628

  • Mang, J., Küchenhoff, H., Meinck, S., & Prenzel, M. (2021). Sampling weights in multilevel modelling: An investigation using PISA sampling structures. Large-Scale Assessments in Education, 9(1), 6. https://doi.org/10.1186/s40536-021-00099-0

  • McKeown, C., Denner, S., McAteer, S., & Shiel, G. (2019). Learning for the Future: The performance of 15-year-olds in Ireland on Reading Literacy, Science and Mathematics in PISA 2018. Educational Research Centre. https://www.erc.ie/wp-content/uploads/2020/07/B23321-PISA-2018-National-Report-for-Ireland-Full-Report-Web-4.pdf. Accessed 28 Nov 2023.

  • McNamara, G., Skerritt, C., O’Hara, J., O’Brien, S., & Brown, M. (2022). For improvement, accountability, or the economy? Reflecting on the purpose(s) of school self-evaluation in Ireland. Journal of Educational Administration and History, 54(2), 158–173. https://doi.org/10.1080/00220620.2021.1985439

  • Meyer, H.-D., & Benavot, A. (2013). PISA, power, and policy: The emergence of global educational governance. Symposium Books.

  • Mullis, I. V. S., & Martin, M. O. (2017). TIMSS 2019 Assessment Frameworks. TIMSS & PIRLS International Study Center, Lynch School of Education, Boston College and International Association for the Evaluation of Educational Achievement (IEA).

  • Mullis, I. V. S., Martin, M. O., Foy, P., Kelly, D. L., & Fishbein, B. (2020). TIMSS 2019 International Results in Mathematics and Science. TIMSS & PIRLS International Study Center, Lynch School of Education and Human Development, Boston College and International Association for the Evaluation of Educational Achievement (IEA).

  • Muthén, L. K., & Muthén, B. O. (2017). Mplus user’s guide (8th ed.). Muthén & Muthén.

  • Nelis, S., & Gilleece, L. (2023). Ireland’s National Assessments of Mathematics and English Reading 2021: A focus on achievement in urban DEIS schools. Educational Research Centre. https://www.erc.ie/wp-content/uploads/2023/05/B23572-NAMER-DEIS-report-Online.pdf. Accessed 28 Nov 2023.

  • Nelis, S., Gilleece, L., Fitzgerald, C., & Cosgrove, J. (2021). Beyond achievement: Home, school and wellbeing findings from PISA 2018 for students in DEIS and non-DEIS schools. Educational Research Centre. https://www.erc.ie/wp-content/uploads/2021/06/FINAL_Web_version_ERC-PISA-DEIS-Report-II_May-2021.pdf. Accessed 28 Nov 2023.

  • Nonte, S., Clerkin, A., & Perkins, R. (2022). An examination of science achievement and school compositional effects in Ireland using TIMSS data. European Journal of Educational Research, 11(4), 2523–2536. https://doi.org/10.12973/eu-jer.11.4.2523

  • OECD. (2018). Equity in education: Breaking down barriers to social mobility. PISA, OECD Publishing. https://doi.org/10.1787/9789264073234-en

  • OECD. (2019). PISA 2018 results (Volume I): What students know and can do. PISA, OECD Publishing. https://doi.org/10.1787/5f07c754-en

  • Perkins, R., & Clerkin, A. (2020). TIMSS 2019: Ireland’s Results in Mathematics and Science. Educational Research Centre. https://www.erc.ie/wp-content/uploads/2020/12/03-ERC-TIMSS-2019-Report_A4_Online.pdf. Accessed 28 Nov 2023.

  • Pitsia, V., Karakolidis, A., & Shiel, G. (2019). High achievement in mathematics and science: A multilevel analysis of TIMSS 2015 data for Ireland [Conference presentation]. 8th International Association for the Evaluation of Educational Achievement (IEA) International Research Conference.

  • Ringarp, J. (2016). PISA lends legitimacy: A study of education policy changes in Germany and Sweden after 2000. European Educational Research Journal, 15(4), 447–461. https://doi.org/10.1177/1474904116630754

  • Rowley, K. J., Edmunds, C. C., Dufur, M. J., Jarvis, J. A., & Silveira, F. (2020). Contextualising the achievement gap: Assessing educational achievement, inequality, and disadvantage in high-income countries. Comparative Education, 56(4), 459–483. https://doi.org/10.1080/03050068.2020.1769928

  • Sciffer, M. G., Perry, L. B., & McConney, A. (2020). Critiques of socio-economic school compositional effects: Are they valid? British Journal of Sociology of Education, 41(4), 462–475. https://doi.org/10.1080/01425692.2020.1736000

  • Sciffer, M. G., Perry, L. B., & McConney, A. (2022). Does school socioeconomic composition matter more in some countries than others, and if so, why? Comparative Education, 58(1), 37–51. https://doi.org/10.1080/03050068.2021.2013045

  • Shiel, G., Kavanagh, L., & Millar, D. (2014). The 2014 National Assessments of English reading and mathematics: Performance report (Vol. 1). Educational Research Centre.

  • Sirin, S. R. (2005). Socioeconomic status and academic achievement: A meta-analytic review of research. Review of Educational Research, 75(3), 417–453. https://doi.org/10.3102/00346543075003417

  • Smyth, E., McCoy, S., & Kingston, G. (2015). Learning from the Evaluation of DEIS. Research Series No. 39. The Economic and Social Research Institute (ESRI).

  • UNESCO. (2018). Handbook on measuring equity in education. UNESCO Institute for Statistics.

  • Weir, S. (2016). Raising achievement in schools in disadvantaged areas. In S. Edgar (Ed.), Successful approaches to raising attainment and tackling inequity (pp. 74–89). Education Scotland.

  • Weir, S., & Archer, P. (2011). A report on the first phase of the evaluation of DEIS. Educational Research Centre. https://www.erc.ie/documents/deis_p1_main.pdf. Accessed 28 Nov 2023.

  • Weir, S., & Denner, S. (2013). The evaluation of the school support programme under DEIS: Changes in pupil achievement between 2007 and 2013. Educational Research Centre.

  • Wilkinson, I. A. G. (2002). Introduction: Peer influences on learning: Where are they? International Journal of Educational Research, 37(5), 395–401. https://doi.org/10.1016/S0883-0355(03)00012-0

  • Woltman, H., Feldstain, A., MacKay, J. C., & Rocchi, M. (2012). An introduction to hierarchical linear modeling. Tutorials in Quantitative Methods for Psychology, 8(1), 52–69. https://doi.org/10.20982/tqmp.08.1.p052

Acknowledgements

Not applicable.

Funding

Not applicable.

Author information

Contributions

All authors made a substantial contribution to the conception, design, statistical analysis and drafting/writing of this research paper. All authors read and approved the final manuscript.

Corresponding author

Correspondence to Anastasios Karakolidis.

Ethics declarations

Ethics approval and consent to participate

Not applicable.

Consent for publication

Not applicable.

Competing interests

The authors declare that they have no competing interests.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary Information

Additional file 1: Table S1. School mean HRL by TIMSS cycle. Table S2. Percent of between-school variance in home resources for learning.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.

About this article

Cite this article

Duggan, A., Karakolidis, A., Clerkin, A. et al. Trends in educational inequalities in Ireland’s primary schools: an analysis based on TIMSS data (2011–2019). Large-scale Assess Educ 11, 39 (2023). https://doi.org/10.1186/s40536-023-00188-2

Keywords