The substantiveness of socioeconomic school compositional effects in Australia: measurement error and the relationship with academic composition
Large-scale Assessments in Education volume 10, Article number: 21 (2022)
Abstract
This study examines the effect of school socioeconomic composition on student achievement growth in Australian schooling, and its relationship with academic composition, utilising the National Assessment Program—Literacy and Numeracy (NAPLAN) dataset. Previous research has found that school composition predicts a range of schooling outcomes. A critique of school compositional research has been that measurement error may have biased findings of compositional effects. Prior studies have found that socioeconomic compositional effect sizes are small when models include academic composition. The relationship between socioeconomic and academic compositions has yet to be fully determined. Multi-level regressions and structural equation models were compared to estimate the degree of bias in socioeconomic compositional effects due to measurement error. Multi-level path models were used to test whether academic composition mediated the relationship between socioeconomic composition and achievement growth. The results showed that measurement error did not bias compositional effects in the dataset, and that academic composition mediated the relationship between socioeconomic composition and achievement growth. We argue that school value-add research should include academic composition to account for contextual effects. The socioeconomic compositional effect is of practical significance to policy makers and educational researchers due to its relative size compared to average student achievement growth. Potential reforms include ensuring that public subsidies to private schools in Australia do not increase school segregation, and ameliorating the effects of residential segregation through school funding reforms.
Introduction
The predictive relationship between student socioeconomic status (SES) and academic outcomes has been well established in educational research. Sirin’s (2005) meta-analysis of the relationship between SES and academic achievement found a mean effect size of r = 0.29. National and international testing programs have consistently found SES achievement gaps. Australia’s National Assessment Program—Literacy and Numeracy (NAPLAN) 2018 administration found that in all assessed academic domains across each year group, higher levels of parental education and occupation were associated with higher levels of academic achievement (ACARA, 2018). The OECD’s Programme for International Student Assessment (PISA) most recent administration found that on average across OECD countries student SES accounted for 12% of the variation in reading achievement (OECD, 2019).
Less well established in the literature are the effects of school socioeconomic composition (SEC) and its relationship with other contextual factors. SEC effects are the predictive relationships between the aggregate socioeconomic status of a student body and individual-level student outcomes (Willms, 2010). Student bodies may consist of classes, cohorts, age groups or schools. SEC effects are conceptualised as the difference in outcomes of students who have the same individual SES associated with attending schools with differing socioeconomic compositions (Raudenbush & Bryk, 2002). That is, school socioeconomic composition has an effect distinct from individual student SES.
Prior research has found that socioeconomic composition has a substantial predictive relationship with academic achievement (Perry & McConney, 2010a, 2010b, 2013; McConney & Perry, 2010a, 2010b; Chesters & Daly, 2015, 2017; Lamb & Fullarton, 2002) and achievement growth (Rumberger & Palardy, 2005), but many gaps remain in the literature. No research has established how SEC interrelates with academic composition, or average school-level prior achievement, in predicting achievement growth. Findings on the linearity and consistency of the strength of the SEC effect have diverged (Benito et al., 2014; Chiu & Khoo, 2005; Rumberger & Palardy, 2005). Methodological questions also remain as to whether measurement error and selection effects have biased past findings of SEC effects, and whether SEC effects are of a size to warrant policy interest (Armor et al., 2018; Lauen & Gaddis, 2013; Marks, 2015).
The present study aims to address some of these gaps in our understanding of the socioeconomic compositional effect with the following research questions within the Australian context:
1. How substantial is the influence of measurement error on socioeconomic school compositional effects?
2. Does academic composition mediate the relationship between socioeconomic composition and achievement growth?
3. Is the SEC effect of practical significance?
These research questions aim to locate the potential role of the socioeconomic compositional effect in predicting the equity of the Australian school system. Addressing the issues of measurement error and selection effects indicates whether school effectiveness research, including the development of value-added measures, should be informed by compositional effects (Marks, 2021). Testing the nature of the relationship between academic and socioeconomic compositional effects may improve the conceptualisation of why SEC predicts schooling outcomes. Exploring the substantiveness of socioeconomic composition's effect on achievement growth suggests whether it should be a focus of school policy reforms. Australia offers insights into the role of the socioeconomic compositional effect as a comprehensive, non-tracked schooling system that has engaged in significant market-based reforms (Lubienski et al., 2021), namely substantial increases in public funding to private schools, while also having one of the most socioeconomically segregated schooling systems in the OECD (2018). As such, it offers potential insights into the segregational effects of market-based reforms on schooling outcomes, free from the confounding effects of between-school curriculum tracking.
Background
School compositional effects came to prominence in educational literature following the seminal Coleman Report (Coleman et al., 1966). James Coleman and colleagues investigated the quality of US public education afforded to minority ethnic groups compared to European–American students by relating school characteristics to academic achievement (Coleman et al., 1966). They found that school composition was the largest school-level effect on academic performance. Compositional effects have since been found consistently across international schooling systems, predicting a diverse range of educational outcomes.
International large-scale studies have consistently found that socioeconomic composition is a substantial predictor of student academic achievement. Each cycle of PISA, for example, has found that school-level socioeconomic factors substantively predict academic achievement separate from individual-level SES in most participating countries (OECD, 2003, 2004, 2007, 2010, 2013, 2016, 2019). Secondary analyses of PISA datasets have shown an inverse correlation between the degree of socioeconomic school segregation and national levels of academic achievement (Chiu & Khoo, 2005; Willms, 2010) and factors that may mediate the relationship between composition and achievement (Liu et al., 2015).
Socioeconomic composition has been shown to predict a range of schooling outcomes. Both academic achievement (Lamb & Fullarton, 2002) and achievement growth (Rumberger & Palardy, 2005) across a broad range of subject domains are related to SEC. Academic achievement is predicted by SEC at both primary (Sofroniou et al., 2004) and secondary school levels (Chesters & Daly, 2017). Other schooling outcomes associated with SEC include high school graduation and college attendance (Chesters, 2019; Palardy, 2013) and college choice (Palardy, 2015). Boonen et al. (2014), however, found no statistically significant main effects of a range of compositional variables on achievement growth in early primary school.
Reported effect sizes for socioeconomic composition vary widely in the literature. Van Ewijk and Sleegers’ (2010) meta-analysis of cross-sectional studies found effect sizes ranging from 0.03 to 0.59 standard deviations, with an average of 0.32. The 2015 PISA administration reported that, on average, SEC accounted for 62.6% of the variation in academic achievement due to schools (OECD, 2016). Longitudinal studies have tended to find smaller effect sizes for SEC because they control for selection effects, where prior achievement is positively correlated with SEC. That is, students with higher initial achievement tend to self-select into schools with higher SEC in systems where such selection is possible. Rumberger and Palardy’s (2005) analysis of the National Education Longitudinal Survey of 1988 found SEC effect sizes ranging from 0.05 to 0.21 standard deviations depending on the academic domain. Marks (2015), in a study with primary school samples, found smaller SEC effects ranging from 0.00 to 0.05 standard deviations and argued they were too small to warrant policy reforms.
The relationship with academic composition
Academic composition has been another school-level factor in school effectiveness research. Its conceptualisation, or what it represents, has not been well-defined by the literature, but it is usually measured as average or school-level prior achievement (Marks, 2021). It has been argued that academic composition predicts student-level achievement outcomes due to influencing teacher expectations of student abilities and students’ expectations of themselves (Scheerens et al., 2001). Best practice in modelling school compositional effects includes joint modelling of academic and socioeconomic composition and the relationship between them (Thrupp et al., 2002).
Prior research has found a strong correlation between academic and socioeconomic composition, but the nature of the relationship remains an open question in the literature (Harker & Tymms, 2004; Thrupp et al., 2002). Models that have included academic and socioeconomic composition as simultaneous predictors of academic achievement or growth have found small-to-zero effect sizes for the SEC effect (Dumay & Dupriez, 2008; Lauder et al., 2010; Marks, 2015). This suggests that academic context may mediate socioeconomic context. That is, SEC captures the sorting of students into schools according to prior achievement due to socioeconomic segregation. This sorting process varies the academic composition of schools, which in turn varies academic outcomes.
Compositional effects are a distal explanation of school effectiveness, providing a broad explanation of why school contexts predict school effectiveness. Jencks and Mayer’s (1990) theoretical analysis outlined three types of models that explained how children and young people’s academic performance might be influenced by the socioeconomic composition of their neighbourhoods and schools. Collective socialisation models posit that non-parental adults from a child’s neighbourhood influence the attitudes, beliefs and behaviours of young people (Jencks & Mayer, 1990). Socioeconomically advantaged adults provide a social milieu in which young people learn to value academic performance and the social skills needed for success at school (Jencks & Mayer, 1990; Sui-Chu & Willms, 1996). Epidemic or peer effects models posit that peers influence each other’s beliefs, values, attitudes and behaviours (Jencks & Mayer, 1990). Exposure to middle-class peers who value academic performance and model the skills for its success may induce similar values and skills in their peers in a form of social contagion (Bankston & Caldas, 1996; Jencks & Mayer, 1990). Institutional models posit that the key social institutions within a neighbourhood, such as schools, differ in their quality of service depending on the socioeconomic composition of the neighbourhood (Jencks & Mayer, 1990). When researching schools, institutional models look to differences in the experiences young people have of social institutions in differing socioeconomic contexts (Jencks & Mayer, 1990). Institutional and peer effects models have dominated mediational models in school compositional effects research (Liu et al., 2015; Palardy, 2008, 2013; Rumberger & Palardy, 2005; Willms, 2010).
Appropriately specifying models of socioeconomic composition
Limitations of school compositional research have been the potential for selection effects (Lauen & Gaddis, 2013) and measurement error (Marks, 2015) to upwardly bias compositional effects. Selection bias occurs when unmeasured systematic differences between groups confound the relationship between an independent variable and a dependent variable. For example, families with high educational goals for their children may seek to enrol them in schools with good reputations. Due to demand constraints, such schools are more likely to have above-average socioeconomic composition, such as public schools in expensive suburbs or high-fee private schools. Thus, SEC may act as a proxy for unmeasured factors that explain higher achievement in higher SEC schools. Controlling for prior achievement may partially address selection effects by accounting for the clustering of higher ability, or more motivated, students into higher SEC schools (Hallberg et al., 2018). While longitudinal studies account for selection on prior achievement, achievement growth may still be biased because growth rates themselves may differ across selected groups. One method to mitigate selection bias is to include variables that may capture the probability of school selection (Palardy, 2013; Rangvid, 2007). Controlling for school sector in Australian samples likely captures much of the potential selection effect because of the high degree of school choice in Australia (ABS, 2017; Rowe, 2020). Family selection of private schools is an indicator of family academic aspiration in Australia (Warren, 2016).
A critique of school compositional effects research has been that measurement error in SES indicators may lead to “phantom” or spurious socioeconomic compositional effects in multilevel models. Two similar mechanisms have been posited for phantom compositional effects. Firstly, measurement error may deflate individual-level effect sizes while minimally influencing group-level effects (Gorard, 2006; Harker & Tymms, 2004). As compositional effects are the difference between student and aggregated effects (Raudenbush & Bryk, 2002, pp. 139–141), the compositional effect is inflated. Secondly, it has been found that measurement error deflates individual-level effects alongside inflating group-level effects in common modelling scenarios (Pokropek, 2015).
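Following the Raudenbush and Bryk (2002) formulation referred to above, the compositional (contextual) effect can be written as the difference between the between-school and within-school coefficients:

\[
\beta_{\text{contextual}} = \beta_{\text{between}} - \beta_{\text{within}},
\]

so any attenuation of \(\beta_{\text{within}}\) by indicator measurement error, with \(\beta_{\text{between}}\) largely unaffected, mechanically inflates the estimated contextual effect.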
Measurement models derived from factor analytic approaches are an effective method to address measurement errors (Fan, 2003; Muthén, 1991). Factor analysis derives latent variables from the common variance of indicators, partitioning out error and unique variance. The doubly-latent structural equation modelling (SEM) approach to modelling compositional effects (Marsh et al., 2009) affords a means by which compositional effects research can be conducted free from potential measurement error biases. Simulation (Pokropek, 2015) and applied (Televantou et al., 2015) studies have shown the doubly-latent method reliably estimates compositional effects free from biases due to individual-level measurement errors.
A theoretical difficulty with applying factor analytic approaches to the development of SES composite measures arises from the assumed direction of causality between composite measures of SES and its indicators. Factor analysis assumes that latent variables cause variation in the associated reflective indicator variables (Bollen & Bauldry, 2011). That would mean SES causes changes in its underlying indicators such as parental education and occupation. This is inconsistent with how SES is operationalised in the majority of educational research as a convenient summary of measures associated with SES (Avvisati, 2020).
Principal components analysis (PCA) is a common method for constructing composite measures of SES (NCES, 2012, pp. 22–24; OECD, 2017, pp. 339–340). Unlike factor analysis that derives latent factors from the shared variance of indicators, PCA derives components from the total variance of underlying variables (Dunteman, 1989, p. 55). As such, composite measures derived from PCA capture the measurement error of the underlying variables, thus potentially biasing findings when used as predictors in models.
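As an illustration of the kind of composite construction described here, the sketch below extracts a first principal component from four parental indicators using the ‘psych’ package cited later in this study. The data are simulated and the variable names are hypothetical, not the NAPLAN variables themselves.

```r
# Sketch: first-principal-component SES composite from four parental
# indicators (simulated data; hypothetical variable names).
# PCA uses the total variance of the indicators, so measurement error in
# the indicators is carried into the component scores.
library(psych)

set.seed(1)
n <- 200
ses_indicators <- data.frame(
  mother_edu = rnorm(n, mean = 13, sd = 2),   # years of schooling
  father_edu = rnorm(n, mean = 13, sd = 2),
  mother_occ = rnorm(n, mean = 55, sd = 15),  # AUSEI06-style occupational status
  father_occ = rnorm(n, mean = 55, sd = 15)
)

pca_fit   <- principal(ses_indicators, nfactors = 1, rotate = "none")
ses_score <- as.numeric(pca_fit$scores)  # standardised SES composite per student
```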
The standard factor analytic practice of testing model fit is not applicable to SES composite measures. As convenient summaries of SES indicators, such composites are not expected to exhibit a unitary factor structure. Instead, they are developed to simplify the modelling and reporting of SES effects (NCES, 2012, p. 22) by capturing multiple independent social processes that determine a person’s place in a social hierarchy. Thus, poor model fit would simply indicate that an SES composite captures, as intended, a set of largely independent constructs.
The non-unitary assumption of SES composite measures is partly due to the lack of agreement and progress in conceptualising SES (Marks & O’Connell, 2021). Researchers and policy makers are often dependent on pre-existing indicators of SES that are conveniently accessible from school administrative processes, as is the case in the current paper. Whilst outside the scope of this paper, much more work is required to develop both the theory and the measurement of socioeconomic status.
This study applies a novel approach to address the potential biases associated with measurement errors in SES measures. It evaluates the substantiveness of the influence of measurement error on SES and SEC effects in models applied to the same sample. Simulations of increasing measurement error are compared across the PCA and doubly-latent structural equation approaches. If the SES and SEC effect sizes are comparable between the two approaches at low levels of measurement error, it can be assumed that measurement error does not bias models using PCA scores as composite measures of SES, and such models can then be applied to the remaining research questions.
Method
Participants
The research questions were examined through a secondary analysis of the 2017 student-level de-identified NAPLAN dataset. NAPLAN is Australia’s annual population assessment of academic achievement and growth in reading, writing, spelling, grammar and numeracy for students in grades three (313,807 students from 7820 schools), five (311,412 students from 7809 schools), seven (288,946 students from 3422 schools) and nine (281,280 students from 3064 schools). Being a population assessment, it is representative of the diversity of school contexts in Australia.
The Australian Commonwealth is a federation of six states and two territories. The funding and administration of compulsory schooling is shared across federal and state levels of government. Australia’s schooling system consists of primary (kindergarten to grade six) and secondary (grade seven to grade twelve) levels. It is a comprehensive system with no between-school tracking at any age. The majority of students attend state-funded public schools administered by state government education departments. Thirty-five percent of students attend private schools, which are administered by independent bodies such as Catholic dioceses, individual churches or parent boards. A national curriculum exists, with variations in implementation across each state. Within each state, the same curriculum and teaching standards apply across public and private schools. Public schools do not charge compulsory fees and offer open enrolment based on catchment areas, except for a very small subset of selective public schools whose enrolments are based on academic test scores. Private schools determine their own enrolment and exclusion policies and charge fees. As such, the majority of students from low- and middle-income families attend public schools, whereas private schools tend to limit their student bodies to those from middle-to-high income families.
We used 2017 grade five scores (together with the same cohort’s grade three scores from 2015) and 2017 grade nine scores (together with the same cohort’s grade seven scores from 2015) in our analysis. In this way, we were able to examine both socioeconomic and academic compositional effects in relation to achievement growth over 2 years at both the primary and secondary school levels. Students who changed schools between the two measurement occasions were excluded from the analysis to avoid confounding SEC across differing schools. NAPLAN is administered by the Australian Curriculum, Assessment and Reporting Authority (ACARA). The NAPLAN dataset allows for the examination of socioeconomic and academic compositional effects on achievement growth in primary and secondary schools. As a population dataset, NAPLAN also enables the use of descriptive statistical methods to make generalisable comparisons among differing demographic groups.
Measures
The dependent variables were reading, writing, spelling, grammar and punctuation, and numeracy in the NAPLAN dataset. NAPLAN tests are designed to “broadly reflect aspects of literacy and numeracy within the curriculum in all jurisdictions” (ACARA, 2018, p. iv). The knowledge and skills assessed are drawn from the Australian Curriculum: English and Mathematics and the literacy and numeracy general capabilities of the Australian curriculum (ACARA, 2017). The Australian curriculum defines literacy as “the capacity to interpret and use language features, forms, conventions and text structures in imaginative, informative, and persuasive texts” and numeracy as “the knowledge and skills to use mathematics confidently across all learning areas at school and in their lives more broadly” (ACARA, 2017, pp. 6–7).
SES was operationalised as PCA composite scores for each research question. To evaluate potential measurement error in research question one, SES was also operationalised through SEM. The SES composite was derived from four indicator variables of parental self-reports of occupation and education, which were extracted from school enrolment forms at the time of enrolment in kindergarten in primary school and grade seven in secondary school. A time gap of up to 5 years in primary school and 2 years in secondary school exists between enrolment data collection and the NAPLAN assessments. Very few parents are likely to change their highest years of schooling or occupation within those time periods, and the occupational groups are broad; thus, little measurement error would be introduced by the time lag between SES indicator data collection and academic assessment.
The raw measures were ordinal scales of mothers’ and fathers’ education and occupation. Table 1 shows how we mapped the ordinal measures of SES onto interval scales. Highest education was re-coded into years of schooling. Occupation was re-coded onto the average Australian Socioeconomic Index 2006 (AUSEI06) score for the occupational category. AUSEI06 is an interval occupational status scale for the occupational classification categories of the Australian Bureau of Statistics (ABS) (McMillan et al., 2009). Each NAPLAN occupational category consists of multiple ABS occupational categories (ACARA, 2019), so the average AUSEI06 score of the constituent ABS categories was assigned to each of the four NAPLAN occupational codes. In cases where parents reported being outside of the workforce, AUSEI06 scores were imputed from parental education (McMillan et al., 2009, p. 132).
SEC was operationalised as school-average SES. Academic composition was operationalised as a school-level latent factor indicated by school average scores in all five NAPLAN academic domains of prior achievement in grades three or seven (see Fig. 1). Average, or manifest aggregate, scores were utilised for compositional variables for research questions 2 and 3 as NAPLAN is a population dataset and there is thus no need to adjust for aggregation error through latent aggregation (Lüdtke et al., 2008).
Potential selection effects were addressed with prior achievement scores and control variables for sex, indigeneity, language background other than English, and school type (public or private), which is a key mechanism of school choice in Australia (Rowe, 2020). Descriptive statistics for all variables are in the Additional file 1: Tables S1 and S2.
Procedure
Multilevel residualised-change regressions and SEMs were compared to answer research question 1. Residualised-change models are two-occasion growth models where the prior score on a dependent variable is included as a covariate of the dependent variable (Gollwitzer et al., 2014). When academic achievement is the dependent variable, the resulting coefficients of the predictor variables provide estimates of their influence on achievement growth. Equation (1) represents the residualised-change model this study used in research question 1 with PCA scores.
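Written out from the term definitions in the following paragraph (a reconstruction, with \({Y}_{i1j1}\) denoting prior achievement), Eq. (1) takes the form:

\[
Y_{i2j2} = \beta_{0} + \beta_{1}Y_{i1j1} + \beta_{2}\,\mathrm{SES}_{ij} + \beta_{3}\,\overline{\mathrm{SES}}_{j} + \delta_{0j} + \varepsilon_{ij} \qquad (1)
\]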
For student i in school j, \({Y}_{i2j2}\) was academic achievement, \({\beta }_{0}\) was the intercept, \({\beta }_{1}\) was the coefficient for prior achievement, \({\beta }_{2}\) was the coefficient of student SES, \({\beta }_{3}\) was the coefficient of school-average SES, \({\delta }_{0j}\) was the school-level residual and \({\varepsilon }_{ij}\) was the student-level residual. Academic growth was modelled over 2 years of secondary school (grades seven to nine) and 2 years of primary school (grades three to five).
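For illustration only, a residualised-change multilevel regression of this form could be fitted in R with the lme4 package as sketched below; the study itself was estimated in Mplus, and the data and variable names here are simulated stand-ins, not the NAPLAN data.

```r
# Sketch of a residualised-change multilevel model of the form of Eq. (1),
# fitted to simulated data (hypothetical variable names).
library(lme4)

set.seed(2)
n_school <- 100; n_per <- 30
school   <- rep(seq_len(n_school), each = n_per)
sec      <- rnorm(n_school)[school]                  # school-average SES
ses      <- sec + rnorm(n_school * n_per)            # student SES composite
score_t1 <- 0.4 * ses + rnorm(n_school * n_per)      # prior achievement (grade 3/7)
score_t2 <- 0.6 * score_t1 + 0.1 * ses + 0.1 * sec + # achievement (grade 5/9)
            rnorm(n_school * n_per)
dat <- data.frame(school, sec, ses, score_t1, score_t2)

# Including prior achievement as a covariate makes the remaining coefficients
# estimates of effects on achievement growth.
m1 <- lmer(score_t2 ~ score_t1 + ses + sec + (1 | school), data = dat)
summary(m1)
```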
Multilevel residualised-change SEMs were constructed to answer research question 1. Such models had the same structural relations as Eq. (1) but modelled SES and SEC as latent factors and aggregated SEC through latent aggregation (Marsh et al., 2009). Equation (2) represents the residualised-change SEM this study used in research question 1. In this case \({U}_{ij}\) is the latent variable of SES and \({U}_{j}\) is the latent variable of SEC derived from latent aggregation.
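On the same basis, and again reconstructed from the definitions given here, Eq. (2) can be rendered as:

\[
Y_{i2j2} = \beta_{0} + \beta_{1}Y_{i1j1} + \beta_{2}U_{ij} + \beta_{3}U_{j} + \delta_{0j} + \varepsilon_{ij} \qquad (2)
\]

with the four parental education and occupation indicators loading on the latent student-level SES factor \(U_{ij}\), and \(U_{j}\) obtained through latent aggregation.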
The comparison of the substantiveness of the influence of measurement error on SEC effects in PCA-score and SEM models was performed by adding increasing levels of random variation to the indicators of SES. The proportion of additional error in each indicator ranged from 0 to 90%, increasing in increments of 10%. If the comparison revealed little-to-no difference between the SES and SEC effect sizes of the PCA-score and SEM models at 0% introduced error, we argue it is appropriate to use PCA scores in multilevel models of SEC effects. In such a scenario, we moved on to the remaining research questions.
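A minimal sketch of this error-injection step is given below, under the assumption that the proportion of additional error refers to the share of an indicator’s total variance contributed by added Gaussian noise; the indicator name is hypothetical.

```r
# Sketch: add random noise to an SES indicator so that a proportion p of the
# resulting total variance is error variance (assumed interpretation).
add_error <- function(x, p) {
  stopifnot(p >= 0, p < 1)
  noise_sd <- sqrt(var(x) * p / (1 - p))
  x + rnorm(length(x), mean = 0, sd = noise_sd)
}

set.seed(3)
parent_edu   <- rnorm(1000, mean = 13, sd = 2)  # e.g. years of parental schooling
error_levels <- seq(0, 0.9, by = 0.1)           # 0% to 90% in 10% increments
noised       <- lapply(error_levels, function(p) add_error(parent_edu, p))
```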
The mediation of SEC by academic composition was tested through a multilevel path analysis as shown in Fig. 1, to test research question 2. The path analysis was an extension of Eq. (1) with the addition of academic composition mediating the SEC effect. Prior achievement at the student-level was the same domain of achievement as the dependent variable. Academic composition was measured as a latent construct to capture the error-free shared variance of all school-average prior achievement scores. Such a factor represents the academic composition of each grade five or nine cohort within each school.
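In a multilevel path model of this kind, the mediated effect examined in research question 2 is the product of the two school-level paths. Writing \(a\) for the path from SEC to the latent academic composition factor and \(b\) for the path from academic composition to achievement, the indirect effect referred to below is:

\[
\text{indirect effect of SEC} = a \times b
\]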
The substantiveness of the SEC effect was assessed by comparing the unstandardised coefficients of the indirect path of SEC via academic composition to population average achievement growth over 2 years in the same academic domain. This effect size measure enables policy makers and the broad educational community who may not have a background in educational statistics to readily judge the practical importance of school compositional effects. A one unit change, equivalent to one standard deviation, in SEC was used as the basis of the SEC effect as it represents the difference between low and middle, or middle and high, SEC schools. That is, what would be expected with a substantially different degree of school segregation in the population.
Statistical modelling was performed with Mplus. Missing data were handled with the full information maximum likelihood (FIML) method with robust standard errors. It was assumed data were missing at random (MAR), as NAPLAN is a compulsory assessment of all Australian students in grades 3, 5, 7 and 9, excluding a small set of students with intellectual disabilities or less than 1 year’s exposure to the English language. Missingness due to SES can be handled using FIML modelling under the MAR assumption as SES is included in the model. Students missing data on the dependent variable were either absent or withdrawn. Absent students were those who did not participate in a test due to non-attendance at school on the test date and during catch-up periods. Causes of absences likely include extended student illness, family travel or movement, and chronic absenteeism. These factors are likely random except for chronic absenteeism, which likely covaries with SES or Indigenous status, both of which are modelled. Withdrawn students are those whose parents applied to withdraw their child from the test for religious or philosophical reasons, representing 2.3% of grade 5 students and 2.0% of grade 9 students. This low level of missingness at the student level is safe to ignore in a large population-level dataset where the focus of analysis is on group-level relationships. Sample sizes and missing data rates are in Additional file 1: Tables S1 and S2.
FIML has advantages over listwise deletion in that it produces unbiased parameter estimates if data are MAR and has greater power (Enders, 2010, p. 87). An additional advantage of the FIML estimator implemented in Mplus is that it calculates standard errors that are robust to deviations from normality (Muthén & Muthén, 2017, p. 668). PCA scores were calculated with the ‘psych’ package (Revelle, 2021) in R.
Results
We found that measurement error does not appear to inflate SEC effects in the NAPLAN sample, that the SEC effect was mediated by academic composition, and that the SEC effect is sufficiently large to warrant further research and policy responses.
Figure 2 graphs the achievement of national minimum benchmarks by student SES and school SEC in the grade 9 2017 NAPLAN dataset. Both SES and SEC predicted the likelihood of students achieving national minimum benchmarks across all academic domains. For each student SES category, SEC demonstrated a positive relationship with academic achievement. Low SES students in high SEC schools were 2.05 times more likely to achieve minimum benchmarks than low SES students in low SEC schools. The corresponding ratios for middle and high SES students were 1.83 and 1.30, respectively. This indicates that low SES students may be more sensitive to the effects of SEC than high SES students. Non-linearity is also apparent in Fig. 2: the predictive strength of SEC increases as SEC increases for low SES students, whereas it weakens for high SES students. It should be noted that Fig. 2 presents raw descriptive statistics of academic achievement, which are thus likely biased by selection effects.
Figure 3 compares the influence of increasing measurement error in SES indicators in PCA and SEMs in grades 5 and 9. Results are reported as standardised coefficients based on total model variance. In PCA models the lines of best fit tended to show that the inflation of the SEC effect peaked when the proportion of random variance in SES indicators was 70%. Indicator error variance greater than 70% would be very unlikely in educational research. As such, we will use this as a comparative point with 0% random variance for the two modelling methods. At 70% error variance, on average the SEC effect was inflated by 37% with PCA scores and − 8% with SEM. Respective average SES deflation effects were 63% and − 1%. This is consistent with prior research showing that SEM exhibits much lower bias from measurement error in school compositional effects models (Pokropek, 2015; Televantou et al., 2015). With our sample, SEC effects were slightly downwardly biased by measurement error in SEM models at 70% error variance.
The average absolute difference between SES and SEC standardised coefficients in PCA and SEMs at 0% error was 0.002 SD. It can also be observed in Fig. 3 that the curve of the lines of best fit of PCA coefficients levels off as it approaches 0% measurement error. Thus, in the NAPLAN sample, measurement error does not appear to bias coefficients of SES and SEC in multilevel models. As such, we utilise PCA scores in the rest of our modelling.
Figures 4 and 5 show that academic composition mediated the relationship between SEC and achievement growth across all domains in grades five and nine. All indirect paths from SEC to achievement growth were statistically significant with standardised coefficients averaging 0.32 SD at grade five and 0.48 SD at grade nine.
Tables 2, 3, 4 and 5 report the results of the paths not shown in Figs. 4 and 5, together with the overall model fit indices.
Figure 6 presents the unstandardised coefficients (indirect effects) of SEC alongside average achievement growth across each academic domain, comparing the effect of a 1 SD change in SEC with average achievement growth. It shows that SEC accounted for a larger proportion of achievement growth in secondary than in primary schools. This was because average achievement growth was larger in primary schools while the SEC effect tended to be larger in secondary schools. On average, SEC effects were equivalent to 11% of average achievement growth in primary schools and 31% of average achievement growth in secondary schools.
The SEC effect may be larger in secondary schools due to Australian secondary schools having a greater degree of socioeconomic segregation than primary schools. In the 2017 NAPLAN dataset, 31% of grade five students attended private schools whereas 43% of grade nine students attended private schools, consistent with the historical pattern of Australian parents being more likely to exercise school choice at the secondary than primary school levels. The difference between the average scores of SEC of public and private school students was 42% larger in grade nine than it was in grade five. That is, the systemic socioeconomic differences between public and private schools in Australia are greater in secondary schools than primary schools.
Discussion
Our simulations of compositional effects models show that measurement error does not bias the coefficients of SES and SEC in multilevel models in the NAPLAN dataset. This is potentially the case in many other international large-scale assessment datasets as well, suggesting that prior critiques of bias in school compositional research may have been misplaced. While increasing levels of measurement error do bias multilevel models of compositional effects, only a comparison of standardised coefficient sizes between methods that do, and do not, account for measurement error can determine the level of potential bias in compositional effects. Our method of comparing SEMs with multilevel models utilising measures derived from PCA suggests a way for researchers and the administrators of large-scale assessments to test for the degree of measurement error in composite SES variables. Comparing such models indicates the degree to which component scores may differ from true scores.
Our finding that academic composition mediates the SEC effect provides some clarity to the relationship between the two compositional effects in schooling systems highly influenced by parental choice policies. One explanation may be that SEC represents the selection of students of differing prior academic histories into differing schools. That is, socioeconomically advantaged students who enjoy greater academic opportunities tend to self-select into schools with other students of the same background. The composition of such schools may be more conducive to learning due to teachers having higher expectations of student achievement, and/or due to peer effects where students develop higher academic expectations of themselves. Another explanation may be that academic composition captures a range of compositional effects, including SEC, which are highly correlated, yet offer at least partial explanations for school contextual effects. Thus, more research is needed to determine the relationship among a range of compositional effects and why they may jointly predict schooling outcomes. Additionally, in schooling systems where segregation is driven more by curriculum tracking, such as in many European countries, the relationship between academic composition and SEC may differ.
If academic composition mediates the SEC effect due to the sorting of students by differing academic histories, then the academic compositional effect is likely a stronger and more reliable explanation for school effects in the Australian context, being more proximal to school-level outcomes than the social contexts of students in our models. At the same time, the role of SEC in predicting academic segregation cannot be ignored, as it indicates the policy and social causes of academic segregation. School value-add research that seeks to identify explanations for teacher, school and systemic effects should control for selection and compositional effects by including a measure of academic composition. This would mitigate the confounding of school value-add with compositional effects over which many schools have little influence.
Parental choice of government-subsidised private schools is a key driver of the SEC effect in Australia and internationally (Alegre & Ferrer, 2010; Sciffer et al., 2022). Australia has the fourth largest proportion of private school enrolments within the OECD (OECD, 2020), while the capacity of Australian private schools to exclude and expel students with no parliamentary oversight is associated with one of the most segregated schooling systems within the OECD (OECD, 2018). Recent Australian research showing that private schools add no value to the trajectories of student learning (Larsen et al., 2022) indicates that public expenditure on private schools has added no value to overall student achievement while increasing social inequality. While calls have been made to require private schools to be socially representative of the community (Bonnor et al., 2021; Greenwell & Bonnor, 2022), a more politically feasible approach may be to require all schools to be academically representative of the communities from which they draw students. That is, publicly funded private schools could be required to enrol a diverse academic mix of students and be barred from expelling students based on academic achievement.
Beyond school choice policies, spatial segregation also contributes to socioeconomic school segregation (Smith et al., 2018). The residential clustering of families according to SES has multiple causes, many beyond the influence of education policies. This suggests that ameliorative education policies are also needed to address school compositional effects. For example, funding reforms may partially address the deleterious effects of concentrations of disadvantage (Lafortune et al., 2018). Adjustments to tertiary entrance student rankings based on school demographic factors may improve the enrolment representativeness of elite universities.
The socioeconomic compositional effect is of substantive practical significance in Australia. On average across all assessed academic domains, the difference in student achievement growth between low and middle SEC schools predicted by socioeconomic composition, as mediated by academic composition, is equivalent to 11% of average achievement growth in primary schools and 31% of achievement growth in secondary schools. Assuming a constant rate of learning, this is equivalent to almost one term of learning in grades 3–5 and over two terms of learning in grades 7–9. School-level reforms that ignore socioeconomic composition are unlikely to succeed, especially given that research (Palardy, 2008) has found differential effectiveness of school practices according to SEC.
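The conversion into terms of learning follows from simple arithmetic, assuming four school terms per year (eight terms across each two-year growth window):

\[
0.11 \times 8 \approx 0.9 \ \text{terms (grades 3–5)}, \qquad 0.31 \times 8 \approx 2.5 \ \text{terms (grades 7–9)}.
\]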
A limitation of our study was its sole consideration of socioeconomic and academic compositional effects. Other compositional effects that have been found include intellectual ability (Opdenakker & Damme, 2001), ethnicity (Caldas & Bankston, 1998) and migration status (Peetsma et al., 2006). Some of these compositional effects may overlap with, or explain some of, the variance due to SEC. A second limitation of our study was our measure of SES in the NAPLAN dataset. Both parental occupation and education had limited categories of differentiation compared to other large-scale studies such as PISA, and there was no measure of family wealth. As such, the predictive relationships of SES and SEC with achievement growth may have been underestimated due to limiting the variance of SES and excluding an important socioeconomic predictor of educational outcomes (Chesters, 2019; Marks et al., 2006). A third limitation of our study design was that it did not measure class-level effects. The NAPLAN dataset allows the measurement of cohort effects while the PISA dataset allows the measurement of school effects. A three-level model of achievement growth measuring variance at student, classroom and school levels may find that a proportion of the student-level variance found in our residualised-change models is better explained by classroom composition (Lamb & Fullarton, 2002).
Many avenues of future research would advance our understanding of socioeconomic compositional effects. As mentioned, our understanding of the mediators of SEC and achievement growth is not fully developed (Marks, 2015). Such knowledge may be able to guide interventions aimed at ameliorating SEC effects. Palardy’s (2008, p. 26) model of school effects may provide a framework for future studies of mediators of SEC effects. SEM and path analysis may provide researchers a flexible means to identify mediators of compositional effects (Preacher et al., 2010, 2011).
Conclusion
This study has applied a novel simulation strategy to show that measurement error is minuscule in SES indicators in the NAPLAN dataset and it does not bias coefficients of school composition in multilevel models. This provides confidence in the use of PCA in the development of composite measures of SES and SEC in datasets where indicators are drawn from parental self-reports. The study also found that academic composition mediates the relationship between socioeconomic composition and achievement growth. This may be due to socioeconomic compositional measures representing the selection process of academically advantaged students tending to enrol in schools with similar students, or academic composition capturing a range of correlated compositional effects. In either case, school value-add researchers should consider including academic compositional effects to control for contextual effects over which many schools have little influence. The socioeconomic compositional effect is of practical significance to policy makers and educational researchers as it is of a substantial size compared to average achievement growth. This suggests that reforms to public policies that contribute to socioeconomic school segregation, and the amelioration of the deleterious effects of residential segregation, are needed to improve the equity of the Australian schooling system.
Availability of data and materials
The data that support the findings of this study are available from the Australian Curriculum, Assessment and Reporting Authority (ACARA) but restrictions apply to the availability of these data, which were used under license for the current study, and so are not publicly available. Data are however available from the authors upon reasonable request and with permission of ACARA.
Abbreviations
- ABS: Australian Bureau of Statistics
- ACARA: Australian Curriculum, Assessment and Reporting Authority
- AUSEI06: Australian Socioeconomic Index 2006
- CFI: Comparative Fit Index
- FIML: Full Information Maximum Likelihood
- MAR: Missing at Random
- NAPLAN: National Assessment Program, Literacy and Numeracy
- NCES: National Center for Education Statistics
- NESB: Non-English Speaking Background
- NS: Non-Significant
- OECD: Organisation for Economic Co-operation and Development
- PCA: Principal Components Analysis
- PISA: Programme for International Student Assessment
- RMSEA: Root Mean Square Error of Approximation
- SD: Standard Deviation
- SEC: Socioeconomic Composition
- SEM: Structural Equation Modelling
- SES: Socioeconomic Status
- SRMR: Standardised Root Mean Square Residual
- TLI: Tucker–Lewis Index
References
Alegre, M. À., & Ferrer, G. (2010). School regimes and education equity: Some insights based on PISA 2006. British Educational Research Journal, 36(3), 433–461. https://doi.org/10.1080/01411920902989193
Armor, D. J., Marks, G. N., & Malatinszky, A. (2018). The Impact of school SES on student achievement: Evidence from U.S. statewide achievement data. Educational Evaluation and Policy Analysis, 40(4), 613–630. https://doi.org/10.3102/0162373718787917
Australian Bureau of Statistics. (2017). 4221.0—Schools, Australia, 2017. https://www.abs.gov.au/ausstats/abs@.nsf/Lookup/4221.0main+features22017
Australian Curriculum, Assessment and Reporting Authority. (2017). Assessment framework: NAPLAN online 2017–2018. ACARA. https://www.nap.edu.au/docs/default-source/default-document-library/naplan-assessment-framework.pdf
Australian Curriculum, Assessment and Reporting Authority. (2018). NAPLAN achievement in reading, writing, language conventions and numeracy: National report for 2018. ACARA. https://www.nap.edu.au/results-and-reports/national-reports
Australian Curriculum, Assessment and Reporting Authority. (2019). Data standards manual: Student background characteristics. ACARA. https://www.acara.edu.au/reporting/data-standards-manual-student-background-characteristics
Avvisati, F. (2020). The measure of socio-economic status in PISA: A review and some suggested improvements. Large-Scale Assessments in Education, 8(1), 8. https://doi.org/10.1186/s40536-020-00086-x
Bankston, C., & Caldas, S. J. (1996). Majority African American schools and social injustice: The influence of de facto segregation on academic achievement. Social Forces, 75(2), 535–555. https://doi.org/10.2307/2580412
Benito, R., Alegre, M. A., & Gonzàlez-Balletbò, I. (2014). School segregation and its effects on educational equality and efficiency in 16 OECD comprehensive school systems. Comparative Education Review, 58(1), 104–134. https://doi.org/10.1086/672011
Bollen, K. A., & Bauldry, S. (2011). Three Cs in measurement models: Causal indicators, composite indicators, and covariates. Psychological Methods, 16(3), 265–284. https://doi.org/10.1037/a0024448
Bonnor, C., Kidson, P., Piccoli, A., Sahlberg, P., & Wilson, R. (2021). Structural failure: Why Australia keeps falling short of our educational goals. UNSW Gonski Institute. https://www.gie.unsw.edu.au/structural-failure-why-australia-keeps-falling-short-its-educational-goals
Boonen, T., Speybroeck, S., de Bilde, J., Lamote, C., Van Damme, J., & Onghena, P. (2014). Does it matter who your schoolmates are? An investigation of the association between school composition, school processes and mathematics achievement in the early years of primary education. British Educational Research Journal, 40(3), 441–466. https://doi.org/10.1002/berj.3090
Caldas, S. J., & Bankston, C. (1998). The inequality of separation: Racial composition of schools and academic achievement. Educational Administration Quarterly, 34(4), 533–557. https://doi.org/10.1177/0013161x98034004005
Chesters, J. (2019). Alleviating or exacerbating disadvantage: Does school attended mediate the association between family background and educational attainment? Journal of Education Policy, 34(3), 331–350. https://doi.org/10.1080/02680939.2018.1488001
Chesters, J., & Daly, A. (2015). The determinants of academic achievement among primary school students: A case study of the Australian Capital Territory. Australian Journal of Labour Economics, 18(1), 131–144.
Chesters, J., & Daly, A. (2017). Do peer effects mediate the association between family socio-economic status and educational achievement? Australian Journal of Social Issues, 52(1), 63–77. https://doi.org/10.1002/ajs4.3
Chiu, M. M., & Khoo, L. (2005). Effects of resources, inequality, and privilege bias on achievement: Country, school, and student level analyses. American Educational Research Journal, 42(4), 575–603. https://doi.org/10.3102/00028312042004575
Coleman, J. S., Campbell, E., Hobson, C., McPartland, J., Mood, A., Weinfeld, F., & York, R. (1966). Equality of educational opportunity (Report No. OE-38001). US Government Printing Office.
Dumay, X., & Dupriez, V. (2008). Does the school composition effect matter? Evidence from Belgian data. British Journal of Educational Studies, 56(4), 440–477. https://doi.org/10.1111/j.1467-8527.2008.00418.x
Dunteman, G. H. (1989). Principal components analysis. Sage. https://doi.org/10.4135/9781412985475
Enders, C. K. (2010). Applied missing data analysis. Guilford Press.
Fan, X. (2003). Two approaches for correcting correlation attenuation caused by measurement error: Implications for research practice. Educational and Psychological Measurement, 63(6), 915–930. https://doi.org/10.1177/0013164403251319
Gollwitzer, M., Christ, O., & Lemmer, G. (2014). Individual differences make a difference: On the use and the psychometric properties of difference scores in social psychology. European Journal of Social Psychology, 44(7), 673–682. https://doi.org/10.1002/ejsp.2042
Gorard, S. (2006). Is there a school mix effect? Educational Review, 58(1), 87–94. https://doi.org/10.1080/00131910500352739
Greenwell, T., & Bonnor, C. (2022). Waiting for Gonski: How Australia failed its schools. University of New South Wales Press.
Hallberg, K., Cook, T. D., Steiner, P. M., & Clark, M. H. (2018). Pretest measures of the study outcome and the elimination of selection bias: Evidence from three within study comparisons. Prevention Science, 19(3), 274–283. https://doi.org/10.1007/s11121-016-0732-6
Harker, R., & Tymms, P. (2004). The effects of student composition on school outcomes. School Effectiveness and School Improvement, 15(2), 177–199. https://doi.org/10.1076/sesi.15.2.177.30432
Jencks, C., & Mayer, S. E. (1990). The social consequences of growing up in a poor neighborhood. In L. E. Lynn & M. F. H. McGeary (Eds.), Inner-city poverty in the United States (pp. 111–186). The National Academies Press.
Lafortune, J., Rothstein, J., & Schanzenbach, D. W. (2018). School finance reform and the distribution of student achievement. American Economic Journal: Applied Economics, 10(2), 1–26. https://doi.org/10.1257/app.20160567
Lamb, S., & Fullarton, S. (2002). Classroom and school factors affecting mathematics achievement: A comparative study of Australia and the United States using TIMSS. Australian Journal of Education, 46(2), 154–171. https://doi.org/10.1177/000494410204600205
Larsen, S., Forbes, A. Q., Little, C. W., Alaba, S. H., & Coventry, W. L. (2022). The public-private debate: School sector differences in academic achievement from year 3 to year 9? Australian Educational Researcher. https://doi.org/10.1007/s13384-021-00498-w
Lauder, H., Kounali, D., Robinson, T., & Goldstein, H. (2010). Pupil composition and accountability: An analysis in English primary schools. International Journal of Educational Research, 49(2), 49–68. https://doi.org/10.1016/j.ijer.2010.08.001
Lauen, D. L., & Gaddis, S. M. (2013). Exposure to classroom poverty and test score achievement: Contextual effects or selection? American Journal of Sociology, 118(4), 943–979. https://doi.org/10.1086/668408
Liu, H., Van Damme, J., Gielen, S., & Van Den Noortgate, W. (2015). School processes mediate school compositional effects: Model specification and estimation. British Educational Research Journal, 41, 423–447. https://doi.org/10.1002/berj.3147
Lubienski, C., Perry, L. B., Kim, J., & Canbolat, Y. (2021). Market models and segregation: Examining mechanisms of student sorting. Comparative Education, 58(1), 16–36. https://doi.org/10.1080/03050068.2021.2013043
Lüdtke, O., Marsh, H. W., Robitzsch, A., Trautwein, U., Asparouhov, T., & Muthén, B. (2008). The multilevel latent covariate model: A new, more reliable approach to group-level effects in contextual studies. Psychological Methods, 13(3), 203–229. https://doi.org/10.1037/a0012869
Marks, G. N. (2015). Are school-SES effects statistical artefacts? Evidence from longitudinal population data. Oxford Review of Education, 41(1), 122–144. https://doi.org/10.1080/03054985.2015.1006613
Marks, G. N. (2021). Should value-added school effects models include student- and school-level covariates? Evidence from Australian population assessment data. British Educational Research Journal, 47(1), 181–204. https://doi.org/10.1002/berj.3684
Marks, G. N., Cresswell, J., & Ainley, J. (2006). Explaining socioeconomic inequalities in student achievement: The role of home and school factors. Educational Research and Evaluation, 12(2), 105–128. https://doi.org/10.1080/13803610600587040
Marks, G. N., & O’Connell, M. (2021). Inadequacies in the SES–Achievement model: Evidence from PISA and other studies. Review of Education, 9(3), e3293. https://doi.org/10.1002/rev3.3293
Marsh, H. W., Lüdtke, O., Robitzsch, A., Trautwein, U., Asparouhov, T., Muthén, B., & Nagengast, B. (2009). Doubly-latent models of school contextual effects: Integrating multilevel and structural equation approaches to control measurement and sampling error. Multivariate Behavioral Research, 44(6), 764–802. https://doi.org/10.1080/00273170903333665
McMillan, J., Beavis, A., & Jones, F. L. (2009). The AUSEI06: A new socioeconomic index for Australia. Journal of Sociology, 45(2), 123–149. https://doi.org/10.1177/1440783309103342
McConney, A., & Perry, L. B. (2010). Science and mathematics achievement in Australia: The role of school socioeconomic composition in educational equity and effectiveness. International Journal of Science and Mathematics Education 8, 429–452. https://doi.org/10.1007/s10763-010-9197-4
McConney, A., & Perry, L. B. (2010). Socioeconomic status, self-efficacy, and mathematics achievement in Australia: A secondary analysis. Educational Research for Policy and Practice 9, 77–91. https://doi.org/10.1007/s10671-010-9083-4
Muthén, B. O. (1991). Multilevel factor analysis of class and student achievement components. Journal of Educational Measurement, 28(4), 338–354. https://doi.org/10.1111/j.1745-3984.1991.tb00363.x
Muthén, L. K., & Muthén, B. O. (2017). Mplus user’s guide (8th ed.). Muthén & Muthén.
National Center for Education Statistics. (2012). Improving the measurement of socioeconomic status for the National Assessment of Educational Progress: A theoretical foundation. https://nces.ed.gov/nationsreportcard/pdf/researchcenter/Socioeconomic_Factors.pdf
Opdenakker, M.-C., & Damme, J. (2001). Relationship between school composition and characteristics of school process and their effect on mathematics achievement. British Educational Research Journal, 27(4), 407–432. https://doi.org/10.1080/01411920120071434
Organisation for Economic Co-operation and Development. (2003). Literacy skills for the world of tomorrow: Further results from Pisa 2000. OECD. https://doi.org/10.1787/9789264102873-en
Organisation for Economic Co-operation and Development. (2004). Learning for tomorrow’s world: First results from PISA 2003. OECD. https://doi.org/10.1787/9789264006416-en
Organisation for Economic Co-operation and Development. (2007). PISA 2006: Science competencies for tomorrow’s world: Volume 1: Analysis. OECD. https://doi.org/10.1787/19963777
Organisation for Economic Co-operation and Development. (2010). PISA 2009 results: Overcoming social background: Equity in learning opportunities and outcomes (Volume II). OECD. https://doi.org/10.1787/9789264091504-en
Organisation for Economic Co-operation and Development. (2013). PISA 2012 results: Excellence through equity (Volume II): Giving every student the chance to succeed. OECD. https://doi.org/10.1787/9789264201132-en
Organisation for Economic Co-operation and Development. (2016). PISA 2015 results (Volume I): Excellence and equity in education. OECD. https://doi.org/10.1787/9789264266490-en
Organisation for Economic Co-operation and Development. (2017). PISA 2015 technical report. OECD.
Organisation for Economic Co-operation and Development. (2018). Equity in education: Breaking down barriers to social mobility. OECD. https://doi.org/10.1787/9789264073234-en
Organisation for Economic Co-operation and Development. (2019). PISA 2018 results (Volume II): Where all students can succeed. OECD. https://doi.org/10.1787/b5fd1b8f-en
Organisation for Economic Co-operation and Development. (2020). PISA 2018 results (Volume V): Effective policies, successful schools. OECD. https://doi.org/10.1787/ca768d40-en
Palardy, G. J. (2008). Differential school effects among low, middle, and high social class composition schools: A multiple group, multilevel latent growth curve analysis. School Effectiveness and School Improvement, 19(1), 21–49. https://doi.org/10.1080/09243450801936845
Palardy, G. J. (2013). High school socioeconomic segregation and student attainment. American Educational Research Journal, 50(4), 714–754. https://doi.org/10.3102/0002831213481240
Palardy, G. J. (2015). High school socioeconomic composition and college choice: Multilevel mediation via organizational habitus, school practices, peer and staff attitudes. School Effectiveness and School Improvement, 26(3), 329–353. https://doi.org/10.1080/09243453.2014.965182
Peetsma, T., Van Der Veen, I., Koopman, P., & Van Schooten, E. (2006). Class composition influences on pupils’ cognitive development. School Effectiveness and School Improvement, 17(3), 275–302. https://doi.org/10.1080/13803610500480114
Perry, L. B., & McConney, A. (2010a). Does the SES of the school matter? An examination of socioeconomic status and student achievement using PISA 2003. Teachers College Record, 112(4), 1137–1162. https://doi.org/10.1177/016146811011200401
Perry, L. B., & McConney, A. (2010b). School socio-economic composition and student outcomes in Australia: Implications for educational policy. Australian Journal of Education, 54(1), 72–85. https://doi.org/10.1177/000494411005400106
Perry, L. B., & McConney, A. (2013). School socioeconomic status and student outcomes in reading and mathematics: A comparison of Australia and Canada. Australian Journal of Education, 57(2), 124–140. https://doi.org/10.1177/0004944113485836
Pokropek, A. (2015). Phantom effects in multilevel compositional analysis: Problems and solutions. Sociological Methods & Research, 44(4), 677–705. https://doi.org/10.1177/0049124114553801
Preacher, K. J., Zhang, Z., & Zyphur, M. J. (2011). Alternative methods for assessing mediation in multilevel data: The advantages of multilevel SEM. Structural Equation Modeling: A Multidisciplinary Journal, 18(2), 161–182. https://doi.org/10.1080/10705511.2011.557329
Preacher, K. J., Zyphur, M. J., & Zhang, Z. (2010). A general multilevel SEM framework for assessing multilevel mediation. Psychological Methods, 15(3), 209–233. https://doi.org/10.1037/a0020141
Rangvid, B. S. (2007). School composition effects in Denmark: Quantile regression evidence from PISA 2000. Empirical Economics, 33(2), 359–388. https://doi.org/10.1007/s00181-007-0133-6
Raudenbush, S. W., & Bryk, A. S. (2002). Hierarchical linear models: Applications and data analysis methods (2nd ed., Vol. 1). Sage.
Revelle, W. (2021). Package ‘psych’. The Comprehensive R Archive Network. https://cran.r-project.org/web/packages/psych/index.html
Rowe, E. (2020). Counting national school enrolment shares in Australia: The political arithmetic of declining public school enrolment. Australian Educational Researcher, 47, 517–535. https://doi.org/10.1007/s13384-019-00365-9
Rumberger, R. W., & Palardy, G. J. (2005). Does segregation still matter? The impact of student composition on academic achievement in high school. Teachers College Record, 107(9), 1999–2045.
Scheerens, J., Bosker, R. J., & Creemers, B. P. M. (2001). Time for self-criticism: On the viability of school effectiveness research. School Effectiveness and School Improvement, 12(1), 131–157. https://doi.org/10.1076/sesi.12.1.131.3464
Sciffer, M. G., Perry, L. B., & McConney, A. (2022). Does school socioeconomic composition matter more in some countries than others, and if so, why? Comparative Education, 58(1), 37–51. https://doi.org/10.1080/03050068.2021.2013045
Sirin, S. R. (2005). Socioeconomic status and academic achievement: A meta-analytic review of research. Review of Educational Research, 75(3), 417–453. https://doi.org/10.3102/00346543075003417
Smith, C., Parr, N., & Muhidin, S. (2018). Mapping schools’ NAPLAN results: A spatial inequality of school outcomes in Australia. Geographical Research, 57, 133–150. https://doi.org/10.1111/1745-5871.12317
Sofroniou, N., Archer, P., & Weir, S. (2004). An analysis of the association between socioeconomic context, gender, and achievement. The Irish Journal of Education / Iris Eireannach an Oideachais, 35, 58–72.
Sui-Chu, E. H., & Willms, J. D. (1996). Effects of parental involvement on eighth-grade achievement. Sociology of Education, 69(2), 126–141. https://doi.org/10.2307/2112802
Televantou, I., Marsh, H. W., Kyriakides, L., Nagengast, B., Fletcher, J., & Malmberg, L.-E. (2015). Phantom effects in school composition research: Consequences of failure to control biases due to measurement error in traditional multilevel models. School Effectiveness and School Improvement, 26(1), 75–101. https://doi.org/10.1080/09243453.2013.871302
Thrupp, M., Lauder, H., & Robinson, T. (2002). School composition and peer effects. International Journal of Educational Research, 37(5), 483–504. https://doi.org/10.1016/s0883-0355(03)00016-8
Van Ewijk, R., & Sleegers, P. (2010). The effect of peer socioeconomic status on student achievement: A meta-analysis. Educational Research Review, 5(2), 134–150. https://doi.org/10.1016/j.edurev.2010.02.001
Warren, D. (2016). Parents’ choices of primary school. In K. Day (Ed.), The longitudinal study of Australian children annual statistical report 2015 (pp. 153–172). Australian Institute of Family Studies.
Willms, J. D. (2010). School composition and contextual effects on student outcomes. Teachers College Record, 112(4), 1008–1037.
Acknowledgements
Not applicable.
Funding
This research is supported by an Australian Government Research Training Program (RTP) Scholarship.
Author information
Contributions
MS developed and analysed the statistical models and drafted the manuscript text. LP read and commented on the analysis and manuscript text. AM read and commented on the analysis and manuscript text. All authors read and approved the final manuscript.
Authors’ information
Michael G. Sciffer is a Ph.D. student at Murdoch University. His research interests include school segregation, compositional effects, interactions between social contexts and school effectiveness, and the appropriate specification of statistical models.
Laura B. Perry is a Professor of comparative education, sociology of education and education policy at Murdoch University. She conducts comparative research about educational disadvantage and inequalities, especially as they appear between schools and the systems, structures and policies that shape them. The aim of her research is to inform policy and practice for improving equity of educational opportunities, experiences and outcomes. Specific research interests include educational marketisation, school segregation and stratification, and social class and education.
Andrew McConney is an Honorary Research Fellow in educational research, evaluation and assessment at Murdoch University. His interests include research on the effectiveness of teachers and teacher education; the evaluation of science, mathematics and other education-related programmes, typically using mixed-method approaches; and secondary analysis of large-scale international datasets to inform educational policy and practice.
Ethics declarations
Ethics approval and consent to participate
This research was granted exemption from human ethics approval by the Murdoch University Human Research Ethics Committee.
Consent for publication
Not applicable.
Competing interests
The authors declare that they have no competing interests.
Additional information
Publisher's Note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Supplementary Information
Additional file 1: Appendix.
Rights and permissions
Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.
About this article
Cite this article
Sciffer, M.G., Perry, L.B. & McConney, A. The substantiveness of socioeconomic school compositional effects in Australia: measurement error and the relationship with academic composition. Large-scale Assess Educ 10, 21 (2022). https://doi.org/10.1186/s40536-022-00142-8