
Everything in moderation: ICT and reading performance of Dutch 15-year-olds

Abstract

Previous research on the relationship between students’ home and school Information and Communication Technology (ICT) resources and academic performance has shown ambiguous results. The availability of ICT resources at school has been found to be unrelated or negatively related to academic performance, whereas the availability of ICT resources at home has been found to be both positively and negatively related to academic performance. In addition, the frequency of ICT use is related to students’ academic achievement. This relationship has been found to be negative for ICT use at school; for ICT use at home, however, the literature on the relationship with academic performance is again ambiguous. In addition to ICT availability and ICT use, students’ attitudes towards ICT have also been found to play a role in student performance. In the present study, we examine how the availability of ICT resources, students’ use of those resources (at school, outside school for schoolwork, outside school for leisure), and students’ attitudes towards ICT (interest in ICT, perceived ICT competence, perceived ICT autonomy) relate to individual differences in performance on a digital assessment of reading in one comprehensive model, using the Dutch PISA 2015 sample of 5183 15-year-olds (49.2% male). Student gender and students’ economic, social, and cultural status accounted for a substantial part of the variation in digitally assessed reading performance. Controlling for these relationships, results indicated that students with moderate access to ICT resources, moderate use of ICT at school or outside school for schoolwork, and moderate interest in ICT had the highest digitally assessed reading performance. In contrast, students who reported moderate competence in ICT had the lowest digitally assessed reading performance. In addition, frequent use of ICT outside school for leisure was negatively related to digitally assessed reading performance, whereas perceived autonomy was positively related. Taken together, the findings suggest that excessive access to ICT resources, excessive use of ICT, and excessive interest in ICT are associated with lower digitally assessed reading performance.

Background

Over the last two decades, Information and Communication Technology (ICT) resources have become widely available to students, both at home and at school. Due to the importance of ICT in modern society, a questionnaire collecting information about ICT resources, ICT use (both at school and outside school), and ICT skills was included in the Programme for International Student Assessment (PISA; OECD 2009). In 2012, averaged across the OECD countries, 93% of students reported using a computer at home and 72% of students reported using a computer at school (OECD 2015a). Although countries have made considerable investments in ICT resources, internet connections, and educational software, there is little to no evidence that greater access to or use of ICT resources has led to an increase in mathematics, science, or reading scores (OECD 2015a). To gain further insight into the role of ICT in students’ performance in reading digital texts, the present study explores how ICT resources, ICT use, and ICT attitudes are related to digitally assessed reading performance in Dutch adolescents, using the PISA 2015 database.

It is well documented that the availability and use of ICT are related to student background characteristics. Boys use computers and the internet more often than girls (Drabowicz 2014; Notten et al. 2009) and tend to use computers and the internet more often for entertainment than for school-related tasks (Tømte and Hatlevik 2011). According to the IEA International Computer and Information Literacy Study (ICILS 2013), boys and girls reported equal levels of basic ICT self-efficacy (e.g., searching for and finding files on the computer, creating or editing documents, uploading files to a digital platform) in most participating countries, including the Netherlands (Fraillon et al. 2014). Boys, however, reported higher levels of advanced ICT self-efficacy (e.g., building a website, changing computer settings, creating a computer program or macro) than girls in almost all countries, including the Netherlands (Fraillon et al. 2014). With respect to the Computer and Information Literacy scale, girls scored higher than boys in most countries (Fraillon et al. 2014), including the Netherlands (Meelissen et al. 2014). In addition, a higher socio-economic status was generally associated with higher Computer and Information Literacy proficiency (Fraillon et al. 2014). Students from high socio-economic backgrounds have greater access to ICT and report higher competence with regard to ICT (Zhong 2011). The studies discussed below all control for gender and economic, social, and cultural status (ESCS) in their analyses.

The literature on the relationship between the availability of ICT resources and academic achievement is ambiguous (e.g., Angrist and Lavy 2002; Fuchs and Wößmann 2005; Lee and Wu 2012), partly because the relationship differs between ICT resources at home and ICT resources at school. With regard to the availability of ICT resources at school, several studies have found that increased availability of ICT was not associated with improved reading performance (Angrist and Lavy 2002; Goolsbee and Guryan 2006; Rouse et al. 2004). Lee and Wu (2012) found neither a direct nor an indirect relation with reading performance in their study of engagement in online reading activities and PISA 2009 reading literacy results. Similarly, Fuchs and Wößmann (2005), using the PISA 2000 data, were unable to find evidence of a relationship between student achievement and the availability of computers at school. In one longitudinal study evaluating the impact of a nationwide computer subsidy for Dutch schools with high percentages of students from a disadvantaged group, an increase in computers and software was shown to have a negative impact on students’ language and arithmetic scores (Leuven et al. 2007): students’ achievement after receiving funds for computers and software was lower than their achievement before receiving funds.

Concerning the relationship between academic achievement and the availability of resources at home, Notten and Kraaykamp (2009) found that the number of computers at home was positively related to PISA scores. In contrast, Hu et al. (2018) showed that access to ICT at home was negatively related to PISA achievement. Fuchs and Wößmann (2005) also found evidence of a negative relation between student achievement and the availability of computers at home. In examining reading performance, Lee and Wu (2012) showed that access to various ICT items at home was negatively associated with PISA reading literacy; however, they also found evidence of a positive indirect relation between ICT resources at home and academic achievement through online reading engagement. They found that when students read online materials in a meaningful way (both spontaneously and directed), take advantage of online resources, or participate in online discussion forums, having access to various ICT items at home is positively associated with PISA reading literacy (Lee and Wu 2012). Taken together, these results suggest that the relation between the availability of ICT at home and reading performance is ambiguous, with some studies showing a positive relationship and others a negative one.

In addition to the availability of computers, the frequency of use both at school and at home might play a role in student reading performance. In examining the PISA 2015 data of 44 countries, including the Netherlands, Hu et al. (2018) found evidence of a negative association between ICT use at school (e.g., for sending emails or browsing the internet for schoolwork) and reading performance. These results indicate that students who use ICT resources more often for academically related work tend to have lower reading scores. These results are in line with the associations found between ICT use at school and reading performance in PISA 2012 (Petko et al. 2017). As in 2015, the 2012 data show that ICT use at school is negatively related to reading achievement. With respect to ICT use at school, it has been suggested that only a narrow set of learning areas is affected by computer programs (Skryabin et al. 2015) and that teachers use ICT resources only for a narrow set of pedagogical purposes, without changing the ways of teaching they already used (Ertmer and Ottenbreit-Leftwich 2010, 2013). It has also been suggested that, in schools, ICT is often used in a remedial fashion by lower performing students and students with special educational needs (Gilleece and Eivers 2018; Zhang et al. 2016). Finally, Papanastasiou et al. (2005) suggested that the association between ICT use (in general) and academic achievement might not be linear but might follow an inverted U-shape, since excessive use might distract students from their schoolwork.

With regard to the association between ICT use at home and reading performance, results are also ambiguous. This might be due to the different types of activities that students use ICT for at home. Lee and Wu (2012) showed that, as long as students use ICT resources for online reading-related activities (such as reading news, using wikis or online encyclopedias, and participating in online discussion forums), the availability of ICT resources at home has a positive impact on reading performance. A positive relationship with academic achievement was also found when the computer at home was used for education and communication (Fuchs and Wößmann 2005). Whereas Petko et al. (2017) also found that ICT use at home for schoolwork (browsing the internet for schoolwork, using email for communication with other students about schoolwork, and doing homework on a computer) was positively associated with reading achievement, Hu et al. (2018) showed a negative association between ICT use at home for schoolwork and reading performance. One reason for the mixed results might be found in the way ICT use at home for schoolwork was measured in the two studies. In the study based on PISA 2012 (Petko et al. 2017), only seven items were used to measure this construct; in the study based on PISA 2015 (Hu et al. 2018), this increased to twelve items. As also pointed out by Hu et al. (2018), “the effect of ICT academic use outside school becomes inconclusive when different studies employ different ICT activities to produce the overall indicator of ICT use activities at school” (p. 9).

In addition to the use of ICT at home for schoolwork, students might also use ICT at home for leisure activities. Surprisingly, the study by Biagi and Loi (2013) showed a positive relationship between the use of ICT for gaming and PISA 2009 scores in most countries. In line with this result, Hu et al. (2018) found evidence that students who use ICT resources more often for leisure activities (e.g., playing online games, chatting online, reading news on the internet, and downloading new apps on a mobile device) tended to perform better on reading tests. Again, the study by Petko et al. (2017) found different results, showing a negative association between ICT use at home for leisure and reading performance. Similarly, Luu and Freeman (2011) showed that the use of productivity and entertainment software was negatively associated with academic achievement.

In addition to the frequency of use and the type of activities undertaken, students’ attitudes towards ICT might also be related to academic achievement. Luu and Freeman (2011), for example, showed that students with high confidence in ICT tasks had higher science scores than students with low confidence in ICT tasks. Similarly, reading scores have been found to be higher for students who think working with computers is important, fun, or interesting and for students who feel confident in using ICT (Lee and Wu 2012). These relations, however, were mediated by the frequency of engagement in online reading activities, indicating that part of the relation can be explained by the fact that students with more positive attitudes and higher confidence engage in online reading activities more often.

The role of ICT in digital reading

Most information provided through ICT resources is in text or in a written form. One might therefore assume that any relationship between academic performance and ICT access, ICT use, and attitudes towards ICT would be particularly visible when assessing performance in reading digital texts (Anil and Ozer 2012). Naumann and Sälzer (2017) explored digital reading proficiency in German 15-year-olds using data from the computer-based PISA assessment of reading. In this assessment, the paper-based texts and items are transferred to a digital environment (OECD 2017). Though it was a digital assessment, the instrument contained the same (trend) items as the paper-based assessment. That is, linear texts were displayed on a computer screen and students answered questions about the texts on the screen as well. Naumann and Sälzer showed that moderate levels of computer use, both at school and at home, were related to high digitally assessed reading performance. Both low and high levels of computer use were related to lower digitally assessed reading performance, suggesting that the relationship between computer use and digitally assessed reading performance also follows an inverted U-shaped curve. They also showed that positive attitudes were associated with higher digitally assessed reading performance, whereas negative attitudes were associated with lower digitally assessed reading performance.

The IEA Progress in International Reading Literacy Study (PIRLS) 2016 was also extended with an assessment of online reading (ePIRLS). In contrast to the digital reading assessment of PISA, however, fourth-grade students participating in ePIRLS were presented with an assessment of online reading that consisted of a simulated internet environment (Mullis et al. 2015). In addition, students were able to navigate through pages with a variety of features (e.g., graphics, multiple tabs, links, pop-up windows, animation). Bivariate results from this study show that students with high access to digital devices in the home generally have higher online reading scores than students with medium access (Mullis et al. 2017). In addition, a study by Gilleece and Eivers (2018) showed that, after controlling for student background variables, Irish students with an internet connection at home have higher reading scores on both paper-and-pencil PIRLS and ePIRLS than students without an internet connection. Gilleece and Eivers (2018) also showed that students who use a computer at home for schoolwork less frequently have higher reading achievement in ePIRLS. Finally, Irish ePIRLS results showed that students with high enjoyment of reading had higher online reading scores than students with low enjoyment of reading (Gilleece and Eivers 2018), which is in line with results on paper-based reading (e.g., Lee and Wu 2012; Luu and Freeman 2011).

The present study

Data show that the Netherlands, together with Scandinavian countries, belongs to the group of countries with the greatest availability of ICT resources at home and the greatest integration of ICT in both primary (Eurydice 2012) and secondary schools (OECD 2015a). Although the Netherlands has high levels of availability and use of ICT both at home and at school, Dutch teachers spend little time teaching ICT-related skills in comparison to other countries (Fraillon et al. 2014; Meelissen et al. 2014).

In the present study, we aim to answer the following research question: What is the role of ICT resources, use, and attitudes in predicting Dutch PISA 2015 digitally assessed reading performance? We examine the relationships between ICT resources, ICT use, attitudes towards ICT, and digitally assessed reading performance in one comprehensive model using the Dutch PISA 2015 data. With a digital assessment of reading performance and ample information on students’ access to ICT resources, their ICT use, and their attitudes towards ICT, the PISA 2015 data provide a unique opportunity to examine the role of ICT in Dutch 15-year-olds’ digitally assessed reading performance. In addition, the multivariate analysis method used allows us to examine the effects of the various predictors in one comprehensive model.

Methods

Participants

The PISA study applies a two-stage stratified sampling strategy. In the first stage of sampling, a sample of 203 Dutch schools with 15-year-olds was drawn by Westat. These schools were sampled systematically from a comprehensive national list of all PISA-eligible schools, with probabilities proportional to the estimated number of 15-year-old students enrolled in the school. For a comprehensive description of PISA sampling and survey methods, see the technical report (OECD 2015b).

Sampled schools were invited to participate via a letter. When a school from the original sample declined participation, a replacement school was invited to participate. This resulted in an overall sample of 187 Dutch schools. In the second stage of sampling, students within these schools were sampled. The Target Cluster Size (TCS) for the Netherlands was set to 35, which means that a sample of 35 students was selected with equal probability within each school. For schools with fewer than 35 PISA-eligible students, all students were selected. The sample consisted of 5385 students, but due to missing values in the ICT questionnaire (n = 202), the eventual sample for the present study consists of 5183 students (49.2% male) with a mean age of 15 years and 9 months (SD = 3.5 months).
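
To illustrate the first sampling stage, the sketch below simulates systematic probability-proportional-to-size (PPS) selection of schools in R. The school list, the enrolment column, and the absence of stratification are simplifying assumptions for illustration only; the actual selection was carried out by Westat on the stratified national school frame.

```r
# Minimal sketch of systematic PPS school sampling (first PISA sampling stage).
# The frame below is simulated; Westat's procedure also uses explicit and
# implicit stratification and school exclusions.
set.seed(2015)
schools <- data.frame(id       = 1:2000,
                      est_15yo = sample(20:300, 2000, replace = TRUE))
n        <- 203                              # target number of schools
cum_size <- cumsum(schools$est_15yo)         # cumulative measure of size
interval <- tail(cum_size, 1) / n            # sampling interval
start    <- runif(1, 0, interval)            # random start
hits     <- start + interval * (0:(n - 1))   # systematic selection points
selected <- schools[findInterval(hits, c(0, cum_size)), ]
nrow(selected)  # 203 schools; larger schools are more likely to be drawn
```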

Measures

The PISA 2015 digital assessment of reading performance comprised only trend items. The availability of ICT resources was addressed in the general student questionnaire. Data regarding students’ ICT use and ICT attitudes were retrieved from the computer familiarity questionnaire, which asked students to further evaluate different aspects related to digital media and digital devices. These devices included desktop computers, portable laptops, notebooks, smartphones, tablets, cell phones without internet access, game consoles, and internet-connected televisions. In order to examine whether individual items could be combined into the original international scales, a Confirmatory Factor Analysis (CFA) of the Dutch ICT data was carried out. Results indicated good model fit (χ2 (1409) = 2678.52, p < .001, RMSEA = .026, CFI = .984, NFI = .980, GFI = .992, AGFI = .990; Hu and Bentler 1999), suggesting that the dimensionality of the Dutch ICT data corresponds with that of the total PISA data. The CFA was carried out in R using RStudio (RStudio Team 2016), with the lavaan.survey function from the lavaan.survey package (Oberski 2014) and the svrepdesign function from the survey package (Lumley 2017). For an overview of all cognitive items and items of the computer familiarity questionnaire, see the PISA 2015 assessment and analytical framework (OECD 2017).
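
As an illustration of this design-based CFA, a minimal sketch in R is given below. It assumes the Dutch student file has been read into a data frame `nld` containing the final student weight W_FSTUWT and the BRR replicate weights W_FSTURWT1–W_FSTURWT80, and it shows only one latent ICT scale with placeholder item names; the full measurement model, estimator settings, and item names used by the authors may differ.

```r
# Sketch of a complex-survey CFA with lavaan.survey; item names are placeholders.
library(survey)
library(lavaan)
library(lavaan.survey)

# PISA uses Fay's balanced repeated replication with factor 0.5
des <- svrepdesign(weights    = ~W_FSTUWT,
                   repweights = "W_FSTURWT[0-9]+",
                   type       = "Fay", rho = 0.5,
                   data       = nld)

model <- '
  # one ICT scale as an example; the full model includes all seven scales
  ictres =~ IC001Q01 + IC001Q02 + IC001Q03 + IC001Q04 + IC001Q05 + IC001Q06
'
fit      <- cfa(model, data = nld, estimator = "MLM")   # naive fit
fit.surv <- lavaan.survey(lavaan.fit = fit, survey.design = des)
fitMeasures(fit.surv, c("chisq", "df", "rmsea", "cfi"))
```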

Digitally assessed reading performance

The PISA 2015 reading assessment was originally developed for the PISA 2000 cycle through a consensus building process involving reading experts from various countries. In PISA, reading is defined as “understanding, using, reflecting on, and engaging with written texts, in order to achieve one’s goals, to develop one’s knowledge and potential, and to participate in society” (OECD 2017, p. 49). The reading assessment includes two text formats: (1) continuous and (2) non-continuous. Continuous texts are composed of sentences that are organized into paragraphs or larger structures such as sections, chapters, or books. Non-continuous texts, on the other hand, are organized in matrix format. Texts also vary in type, including descriptive texts, narratives, expositions, arguments, instruction, and transaction (see OECD 2017, p. 53 for more information on text formats and text types).

Reading items address three broad interrelated and interdependent reading processes: (1) access and retrieval, (2) integration and interpretation, and (3) reflection and evaluation. The difficulty levels of items range from very straightforward to quite sophisticated reading comprehension activities. Response formats are either multiple-choice or short constructed-response. The multiple-choice items were computer-coded. Constructed response items were judged by a team of coders according to the international coding guidelines. Incorrect answers were coded as no credit answers (score 0). Correct answers to simple questions were provided with full credit (score 1). Correct yet incomplete answers to complex questions were provided with partial credit (score 1), and correct and complete answers to complex questions with full credit (score 2).

Because each student completed only part of the PISA assessment in the cognitive domain, PISA does not provide a single measure of reading comprehension but provides ten “plausible values” for each student, based on item response theory. As with all PISA proficiency scales, the mean of the reading scale was set to 500, with a standard deviation of 100.
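
Analyses therefore have to be run once per plausible value and then combined. A minimal sketch of this pooling step is given below, assuming the ten reading plausible values are stored as PV1READ–PV10READ and the final student weight as W_FSTUWT; dedicated tools such as the IEA IDB Analyzer and the intsvy package perform this pooling (and the accompanying variance estimation) automatically.

```r
# Pooling a statistic over the ten plausible values (Rubin's combination rules).
pv_cols <- paste0("PV", 1:10, "READ")

# One statistic per plausible value, here the weighted mean reading score
means <- sapply(pv_cols, function(pv)
  weighted.mean(nld[[pv]], w = nld$W_FSTUWT, na.rm = TRUE))

pooled  <- mean(means)   # final point estimate
imp_var <- var(means)    # imputation (between-PV) variance
# The total standard error also requires the sampling variance from the BRR
# replicate weights plus (1 + 1/10) * imp_var; that part is omitted here.
pooled
```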

ICT resources

The availability of ICT resources at home was assessed with six items. Students first indicated whether there was educational software and an internet connection at their home (1 = yes; 2 = no). Next, students indicated on a four-point Likert scale (1 = none; 2 = one; 3 = two; 4 = three or more) how many cell phones with internet access, computers, tablet computers, and e-book readers were present in their home. Correlations between the items were all positive and significant, indicating, for example, that students with more computers also had more e-book readers in their home. As indicators of individual students’ scores, the PISA Weighted Likelihood Estimate scale scores were used.

ICT use

With regard to students’ use of ICT, both their use at school and their use outside school were assessed. Nine items addressed their use of ICT at school in general, for example to send e-mails or to browse the internet for schoolwork. Items regarding the use of ICT outside school were divided into two subscales: use of ICT outside school for leisure and use of ICT outside school for schoolwork. Both subscales consisted of twelve activities, for which students indicated on a five-point Likert scale (1 = never or hardly ever; 2 = once or twice a month; 3 = once or twice a week; 4 = almost every day; 5 = every day) how often they made use of digital devices to perform the activity. In the analyses, the Weighted Likelihood Estimates were used for all three scales of ICT use.

ICT attitude

The computer familiarity questionnaire tapped into three aspects of ICT attitude: interest, competence, and autonomy. Students’ ICT interest was assessed with six items. Students’ perceived ICT competence and perceived autonomy related to ICT use were each assessed with five items. All items were answered on a four-point Likert scale (1 = strongly disagree; 2 = disagree; 3 = agree; 4 = strongly agree). Again, we used Weighted Likelihood Estimates as indicators of ICT attitudes.

Procedure

After a short introduction, students first completed the cognitive items, including the items for reading. After a short break, students filled in the items of the student background and computer familiarity questionnaires. The assessment of all cognitive items and background questionnaires was computer-based and took 5 h, including breaks. Students with special educational needs (n = 127) completed the shorter, ‘Une Heure’, version of the cognitive instrument. The testing procedure was in accordance with the PISA standardized protocol.

Data analysis

To deal with the PISA two-stage stratified sampling scheme and following standardized PISA procedures, we included replicate weights in all analyses. Furthermore, we used plausible values to approximate the students’ true scores. First, we estimated population parameters for digitally assessed reading performance and the scales regarding ICT resources, ICT use, and ICT attitude using the IEA IDB Analyzer. Second, correlations between all measures were estimated using the IEA IDB Analyzer. Third, the relationship between ICT resources, ICT use, and ICT attitudes and digitally assessed reading performance was examined with a hierarchical regression model using RStudio with the pisa2015.reg.pv function from the intsvy package (Caro and Biecek 2017). In Step 1 of the regression model, we included gender and economic, social, and cultural status (ESCS), since these variables are two well-known predictors of reading performance (Chiu and McBride-Chang 2006; Luu and Freeman 2011). In Step 2, both the linear and quadratic effects of ICT resources, ICT use (at school in general, outside school for schoolwork, outside school for leisure), and ICT attitudes (interest, perceived competence, perceived autonomy) were added to the model to examine the impact of the quadratic effects over and above the impact of the linear effects. Quadratic effects were calculated by multiplying each predictor by itself.
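
A minimal sketch of this two-step model is given below. The variable names (the PISA 2015 WLE scales ICTRES, USESCH, HOMESCH, ENTUSE, INTICT, COMPICT, and AUTICT, the gender variable ST004D01T, and ESCS) and the gender recoding are assumptions based on the public PISA 2015 codebook rather than the authors’ own code, and the intsvy argument names should be checked against `?pisa2015.reg.pv`.

```r
# Sketch of the hierarchical regression on the ten reading plausible values.
library(intsvy)

nld$GIRL <- as.numeric(nld$ST004D01T == 1)   # hypothetical recode: 1 = female

# Quadratic terms: each predictor multiplied by itself
ict_vars <- c("ICTRES", "USESCH", "HOMESCH", "ENTUSE",
              "INTICT", "COMPICT", "AUTICT")
for (v in ict_vars) {
  nld[[paste0(v, "_SQ")]] <- nld[[v]]^2
}

# Step 1: control variables only
step1 <- pisa2015.reg.pv(x = c("GIRL", "ESCS"),
                         pvlabel = "READ", data = nld)

# Step 2: controls plus linear and quadratic ICT terms
step2 <- pisa2015.reg.pv(x = c("GIRL", "ESCS",
                               as.vector(rbind(ict_vars,
                                               paste0(ict_vars, "_SQ")))),
                         pvlabel = "READ", data = nld)
step2
```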

Results

Inferential statistics

Univariate population parameters on digitally assessed reading performance and all ICT measures are presented in Table 1. The statistics show that the mean digitally assessed reading performance of Dutch students was higher (t (511876) = 35.63, p < .001) than the international average of all countries participating in PISA 2015. The mean score for availability of ICT resources in the Netherlands is higher than the availability of resources in countries participating in the ICT familiarity questionnaire (t (334197) = 70.75, p < .001). Moreover, positive scores for students’ use of ICT at school in general and outside school for schoolwork indicate that their use is also higher than the international average (t (334197) = 33.20, p < .001 and t (334197) = 7.39, p < .001). Students’ use of ICT outside school for leisure, on the other hand, is lower than the international average (t (334197) = 11.14, p < .001). Results furthermore show that levels of interest (t (334197) = 4.34, p < .001) are higher than the international average, whereas perceived competence (t (334197) = 1.49, p = .136) and perceived autonomy (t (334197) = .73, p = .014) are comparable to the international averages.

Table 1 Means and SD for digitally assessed reading performance, ICT resources, ICT use, and ICT attitudes

Correlations

Correlations between digitally assessed reading performance and all ICT measures are presented in Table 2. All correlations, with the exception of the correlations between digitally assessed reading performance and ICT resources and ICT use outside school for schoolwork, were significant. The correlations between digitally assessed reading performance and both ICT use at school in general and ICT use outside school for leisure are negative, indicating that students who spend more time using ICT at school in general or outside school for leisure have lower reading scores. The positive correlations between ICT attitudes and digitally assessed reading performance indicate that higher levels of interest, perceived competence, and perceived autonomy are related to higher digitally assessed reading performance.

Table 2 Correlations between digitally assessed reading performance, ICT resources, ICT use, and ICT attitudes

Results furthermore show that the availability of resources is positively related to students’ ICT use and ICT attitudes. In addition, the three indicators of ICT use are moderately correlated, as are the three indicators of ICT attitudes. The correlations with ICT attitudes are stronger for the use of ICT outside school for leisure than for the use of ICT outside school for schoolwork or at school in general.

Regression

Unstandardized and standardized regression coefficients are presented in Table 3. Intercepts are also included (Field 2009, p. 252). Results of Step 1 show that students’ gender and ESCS account for 13% of the variance in digitally assessed reading performance. The negative regression coefficient for gender indicates that girls had higher digitally assessed reading scores than boys, while taking ESCS into account. The positive regression coefficient for ESCS indicates that students from families with higher economic, social, and cultural backgrounds had higher reading scores than students from families with lower economic, social, and cultural backgrounds after controlling for student gender.

Table 3 Standardized and unstandardized coefficients of ICT resources, use, and attitudes on digitally assessed reading performance

Results of Step 2 show that the ICT related variables account for an extra 9% of the variance in digitally assessed reading performance. The quadratic effect of the availability of ICT resources was significantly related to digitally assessed reading performance over and above the control variables entered at Step 1, the linear effect of ICT resources and the other ICT variables entered in Step 2. The negative sign of the quadratic effect indicates that although students’ digitally assessed reading performance increased with an increasing number of ICT resources, the relationship turned negative with a larger availability of resources. Figure 1 shows the estimated conditional relationship between ICT resources and digitally assessed reading performance based on the model and therefore correcting for the other variables in the model.

Fig. 1
figure 1

Relationship between the availability of ICT resources and digitally assessed reading performance
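
For readers who want to locate the peak of such an inverted U-shape: for a fitted quadratic of the form predicted score = b0 + b1*x + b2*x^2 with a negative b2, the curve reaches its maximum at x = -b1 / (2*b2). The snippet below illustrates the computation with hypothetical coefficients, not the estimates reported in Table 3.

```r
# Turning point of an inverted-U (negative quadratic) relationship.
# The coefficients below are hypothetical placeholders, not the Table 3 values.
b1 <- 12.4    # linear coefficient of the ICT scale
b2 <- -3.1    # quadratic coefficient (negative: inverted U)
turning_point <- -b1 / (2 * b2)   # scale value with the highest predicted score
turning_point                     # 2.0 on the ICT scale in this example
```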

The quadratic effects of ICT use at school in general and outside school for schoolwork were significantly related to digitally assessed reading performance over and above the control variables entered at Step 1, the linear effects of ICT use at school in general and outside school for schoolwork, and the other ICT variables entered in Step 2. The negative signs of the quadratic effects indicate that although students’ digitally assessed reading performance increased with increasing ICT use both at school in general and outside school for schoolwork, these relationships turned negative with excessive use of ICT. The quadratic effect for ICT use outside school for leisure is not significantly related to digitally assessed reading performance. The linear effect, however, is significantly related over and above the control variables. The negative sign indicates that the more students used ICT outside school to play games or to participate in social networks, for example, the lower their digitally assessed reading performance. Figure 2 shows the estimated relationships between ICT use at school in general, outside school for schoolwork, and outside school for leisure and digitally assessed reading performance based on the model, and therefore correcting for the other variables in the model.

Fig. 2
figure 2

Relationship between the use of ICT and digitally assessed reading performance

The quadratic effects of interest in ICT and perceived ICT competence were significantly related to digitally assessed reading performance over and above the control variables entered in Step 1, the linear effects of interest in ICT and perceived ICT competence, and the other ICT variables entered in Step 2. The negative sign of the quadratic term of interest in ICT indicates that although students’ digitally assessed reading performance increased with more interest in ICT, the relationship turned negative with excessive interest in ICT. The positive sign of the quadratic term of perceived ICT competence indicates that although students’ digitally assessed reading performance decreased with more perceived ICT competence, the relationship turned positive with excessive perceived ICT competence. The quadratic effect for perceived ICT autonomy is not significant. The linear effect, however, is significant over and above the control variables entered at Step 1 and the other ICT variables entered in Step 2. The positive sign indicates that the more ICT autonomy students perceived, the higher their digitally assessed reading performance. Figure 3 shows the estimated relationships between interest in ICT, perceived ICT competence, and perceived ICT autonomy and digitally assessed reading performance based on the model and therefore correcting for the other variables in the model. The model including the controls and ICT related variables explained 22% of the variance in digitally assessed reading performance.

Fig. 3
figure 3

Relationship between interest in ICT, perceived ICT competence, and autonomy and digitally assessed reading performance

Discussion

Previous research has shown that the mere presence of ICT resources does not have a positive relationship with digitally assessed reading performance and that the way in which ICT resources are used and attitudes towards ICT play a role. International large-scale assessment studies have shown that the availability of ICT resources at home in the Netherlands is high, in comparison to the availability in other countries (OECD 2015a). In addition, teachers in both primary and secondary education make frequent use of ICT resources at school (Eurydice 2012; OECD 2015a). In the present study, we investigated the role of ICT resources, ICT use, and ICT attitudes in digitally assessed reading performance of Dutch 15-year-olds. Significant relationships for all variables were found and results are discussed in more detail below.

First, results showed that student backgrounds accounted for a substantial part of the variation in reading performance. In line with other studies (e.g., Fraillon et al. 2014; Luu and Freeman 2011), a strong positive relationship with digitally assessed reading performance was found for students’ ESCS. Students from families with higher economic, social, and cultural backgrounds had higher reading scores than students from families with lower economic, social, and cultural backgrounds. In addition, boys had lower digitally assessed reading scores than girls, controlling for ESCS.

Since both ESCS and gender significantly predicted digitally assessed reading performance, they were used as control variables in Step 2. The full model presented in Table 3 shows a negative quadratic relationship, indicating that having access to computers, educational software, and internet at home is negatively related to digitally assessed reading performance only after a certain threshold has been reached. This result is in line with results by Naumann and Sälzer (2017), who also found a negative quadratic relationship between ICT resources at home and digitally assessed reading performance. Our findings thus suggest that, after taking gender and ESCS into account, students who reported having access to a multitude of devices and connections at home had lower scores on digitally assessed reading compared to students with moderate levels of ICT resources.

With respect to the ICT use of Dutch students, results showed that the highest digitally assessed reading performance was associated with moderate use at school and outside school for schoolwork. That is, these relationships were found to be negatively quadratic (inverted U-shape), indicating that moderate but not excessive use of ICT was associated with the highest digitally assessed reading performance. This inverted U-shaped relationship was also found for PISA countries in general (OECD 2015a). In the home context, this relationship has also been elucidated in previous studies. Agasisti et al. (2017) likewise found that greater use of ICT at home, specifically in relation to school-related tasks, was negatively associated with students’ test scores. This relationship may, in part, be explained by inefficiency in using ICT resources. Students may be using ICT more frequently at home because instructions are unclear or because the computer is too slow and software does not meet students’ needs, in which case greater ICT use will not necessarily benefit online reading performance (Agasisti et al. 2017). In the school context, the negative relationship between more frequent use of ICT at school and digitally assessed reading performance may, according to Naumann and Sälzer (2017), be explained by the fact that ICT can also be used in a remedial fashion. Similarly, both Gilleece and Eivers (2018) and Zhang et al. (2016) have found that struggling students are often assigned to work on computers to remediate learning problems, which might explain the negative association between excessive use of ICT in school and digital reading performance in the present study. With regard to the use of ICT for leisure, frequent use was found to relate negatively to digitally assessed reading performance: the more students use ICT for leisure, the lower their digitally assessed reading performance. Steffens (2014) also showed that the use of a video game console is negatively related to PISA achievement. According to Papanastasiou et al. (2005), the use of ICT outside school for leisure might distract students from their schoolwork.

In addition to the availability and use of ICT, attitudes towards ICT were also found to be related to digitally assessed reading performance. Similar to results by Hu et al. (2018), results of the present study show that the relationships between attitudes towards ICT and online reading performance are complex. We found that high perceived ICT autonomy is related to high digitally assessed reading performance. The relationship between interest in ICT and digitally assessed reading performance, however, follows an inverted U-shape. This means that both a lack of interest in ICT and excessive interest in ICT are related to low digitally assessed reading performance: students with moderate interest in ICT performed best in digitally assessed reading. With respect to perceived competence, students who reported moderate competence in ICT had the lowest digitally assessed reading performance, compared with students who reported low or high perceived ICT competence. The difference in digitally assessed reading performance seems to be larger between students with moderate and high perceived ICT competence than between students with moderate and low perceived ICT competence. It might be that students can only benefit from an increase in ICT competence once a certain threshold is reached.

The present study has several limitations. First, PISA is a cross-sectional study. Data were thus gathered at only one time point and not longitudinally, which means that causality cannot be inferred. The relationship between ICT resources, ICT use, and ICT attitudes and digitally assessed reading performance might be explained by another factor not included in the present analyses. In addition, relationships between digitally assessed reading performance and the different aspects of ICT use and attitude might also be reciprocal. Second, although the assessment was computer-based, the texts were still mostly linear. Online texts are often non-linear (with hyperlinks). Future research, for example the ePIRLS study in 2021, might examine whether the relationships between ICT resources, ICT use, and ICT attitudes are stronger when reading such non-linear texts. Moreover, future research could also examine online process and behavioural data (i.e., log-files) to gain more insight into the timing and type of actions students perform while engaging in digital reading tasks.

Conclusion

To conclude, we have shown that moderate access to and use of ICT resources is positively related to the digitally assessed reading performance of Dutch students, whereas excessive access to and use of ICT is negatively associated with Dutch students’ performance in digitally assessed reading. Similarly, an excessive interest in ICT is also negatively associated with students’ performance in digitally assessed reading. These results suggest that simply investing money and time to provide students with ICT resources at home or school, and to increase their use of these resources, does not necessarily enhance digitally assessed reading performance. Researchers and Dutch policy makers should therefore focus on which types of activities promote efficient use of ICT, both at home and at school, and thereby enhance (digital) reading comprehension.

Availability of data and materials

The datasets analysed during the current study are available in the PISA database, http://www.oecd.org/pisa/data/2015database/.

Abbreviations

AGFI: adjusted goodness of fit index

CFI: comparative fit index

CIL: Computer and Information Literacy

ESCS: economic, social and cultural status

GFI: goodness of fit index

ICT: Information and Communication Technology

NFI: normed fit index

OECD: Organisation for Economic Co-operation and Development

PISA: Programme for International Student Assessment

RMSEA: root mean square error of approximation

TCS: target cluster size

References

  • Agasisti, T., Gil-Izquierdo, M., & Han, S. W. (2017). ICT use at home for school-related tasks: What is the effect on a student’s achievement? Empirical evidence from OECD PISA data. MPRA Paper 81343. Munich, Germany: University Library of Munich.

  • Angrist, J., & Lavy, V. (2002). New evidence on classroom computers and pupil learning. Economic Journal, 112, 735–765. https://doi.org/10.3386/w7424.

  • Anil, D., & Ozer, Y. (2012). The effect of the aim and frequency of computer usage on student achievement according to PISA 2006. Procedia-Social and Behavioral Sciences, 46, 5484–5488. https://doi.org/10.1016/j.sbspro.2012.06.462.

  • Biagi, F., & Loi, M. (2013). Measuring ICT use and learning outcomes: Evidence from recent econometric studies. European Journal of Education, 48, 28–42. https://doi.org/10.1111/ejed.12016.

  • Caro, D. H., & Biecek, P. (2017). intsvy: International Assessment Data Manager. R package version 2.1.

  • Chiu, M. M., & McBride-Chang, C. (2006). Gender, context, and reading: A comparison of students in 43 countries. Scientific Studies of Reading, 10, 331–362. https://doi.org/10.1207/s1532799xssr1004_1.

  • Drabowicz, T. (2014). Gender and digital usage inequality among adolescents: A comparative study of 39 countries. Computers & Education, 74, 98–111. https://doi.org/10.1016/j.compedu.2014.01.016.

  • Ertmer, P. A., & Ottenbreit-Leftwich, A. T. (2010). Teacher technology change: How knowledge, confidence, beliefs, and culture intersect. Journal of Research on Technology in Education, 42, 255–284. https://doi.org/10.1080/15391523.2010.10782551.

  • Ertmer, P. A., & Ottenbreit-Leftwich, A. (2013). Removing obstacles to the pedagogical changes required by Jonassen’s vision of authentic technology-enabled learning. Computers & Education, 64, 175–182. https://doi.org/10.1016/j.compedu.2012.10.008.

  • Eurydice. (2012). Key Data on Learning and Innovation Through ICT at School in Europe 2011. https://doi.org/10.2797/61068.

  • Field, A. (2009). Discovering Statistics Using SPSS (3rd ed.). London: SAGE Publications Ltd.

  • Fraillon, J., Ainley, J., Schulz, W., Friedman, T., & Gebhardt, E. (2014). Preparing for life in a digital age: The IEA International Computer and Information Literacy Study International Report. Amsterdam, the Netherlands: International Association for the Evaluation of Educational Achievement (IEA).

  • Fuchs, T., & Wößmann, L. (2005). Computers and Student Learning: Bivariate and multivariate evidence on availability and use of computers at home and at schools. Munich: Ifo Working Paper No. 8.

  • Gilleece, L., & Eivers, E. (2018). Characteristics associated with paper-based and online reading in Ireland: Findings from PIRLS and ePIRLS 2016. International Journal of Educational Research, 91, 16–27. https://doi.org/10.1016/j.ijer.2018.07.004.

  • Goolsbee, A., & Guryan, J. (2006). The impact of Internet subsidies in public schools. The Review of Economics and Statistics, 88, 336–347. https://doi.org/10.3386/w9090.

  • Hu, L., & Bentler, P. M. (1999). Cutoff criteria for fit indexes in covariance structure analysis: Conventional criteria versus new alternatives. Structural Equation Modeling: A Multidisciplinary Journal, 6, 1–55. https://doi.org/10.1080/10705519909540118.

  • Hu, X., Gong, Y., Lai, C., & Leung, F. K. S. (2018). The relationship between ICT and student literacy in mathematics, reading, and science across 44 countries: A multilevel analysis. Computers & Education, 125, 1–13. https://doi.org/10.1016/j.compedu.2018.05.021.

  • Lee, Y.-H., & Wu, J.-Y. (2012). The effect of individual differences in the inner and outer states of ICT on engagement in online reading activities and PISA 2009 reading literacy: Exploring the relationship between the old and new reading literacy. Learning and Individual Differences, 22, 336–342. https://doi.org/10.1016/j.lindif.2012.01.007.

  • Leuven, E. M., Lindahl, M., Oosterbeek, H., & Webbink, D. (2007). The effect of extra funding for disadvantaged pupils on achievement. The Review of Economics and Statistics, 89, 721–736.

  • Lumley, T. (2017). survey: Analysis of complex survey samples. R package version 3.32.

  • Luu, K., & Freeman, J. G. (2011). An analysis of the relationship between information and communication technology (ICT) and scientific literacy in Canada and Australia. Computers & Education, 56, 1072–1082. https://doi.org/10.1016/j.compedu.2010.11.008.

  • Meelissen, M. R. M., Punter, R. A., & Drent, M. (2014). Digitale geletterdheid van leerlingen in het tweede leerjaar van het voortgezet onderwijs: Nederlandse resultaten van ICILS-2013 [Digital literacy of students in the second year of secondary education: Dutch results of ICILS-2013]. Enschede: University of Twente.

  • Mullis, I. V. S., Martin, M. O., Foy, P., & Hooper, M. (2017). ePIRLS 2016 International Results in Online Informational Reading. Retrieved from Boston College, TIMSS & PIRLS International Study Center website: http://timssandpirls.bc.edu/epirls2016/international-results/.

  • Mullis, I. V. S., Martin, M. O., & Sainsbury, M. (2015). PIRLS 2016 reading framework. In I. V. S. Mullis & M. O. Martin (Eds.), PIRLS 2016 Assessment Framework (2nd ed., pp. 11–30). Chestnut Hill: TIMSS & PIRLS International Study Center, Boston College.

  • Naumann, J., & Sälzer, C. (2017). Digital reading proficiency in German 15-year-olds: Evidence from PISA 2012. Zeitschrift für Erziehungswissenschaft, 20, 585–603. https://doi.org/10.1007/s11618-017-1758-y.

  • Notten, N., & Kraaykamp, G. (2009). Home media and science performance: A cross national study. Educational Research and Evaluation, 15, 367–384. https://doi.org/10.1080/13803610903087045.

  • Notten, N., Peter, J., Kraaykamp, G., & Valkenburg, P. M. (2009). Research note: Digital divide across borders—A cross-national study of adolescents’ use of digital technologies. European Sociological Review, 25, 551–560.

  • Oberski, D. L. (2014). lavaan.survey: An R package for complex survey analysis of structural equation models. Journal of Statistical Software, 57, 1–27. http://www.jstatsoft.org/v57/i01/.

  • OECD. (2009). PISA 2009 Assessment Framework: Key Competences in Reading, Mathematics and Science. Paris: OECD Publishing.

  • OECD. (2015a). Students, Computers and Learning: Making the Connection. Paris: OECD Publishing. https://doi.org/10.1787/9789264239555-en.

  • OECD. (2015b). Sampling design. In PISA 2015 Technical Report. Paris: OECD Publishing.

  • OECD. (2017). PISA 2015 Assessment and Analytical Framework: Science, Reading, Mathematic, Financial Literacy and Collaborative Problem Solving. Paris: OECD Publishing. https://doi.org/10.1787/9789264281820-en.

  • Papanastasiou, E. C., Zembylas, M., & Vrasidas, C. (2005). An examination of the PISA database to explore the relationship between computer use and science achievement. Educational Research and Evaluation, 11, 529–543. https://doi.org/10.1080/13803610500254824.

  • Petko, D., Cantieni, A., & Prasse, D. (2017). Perceived quality of educational technology matters: A secondary analysis of students’ ICT use, ICT-related attitudes, and PISA 2012 test scores. Journal of Educational Computing Research, 54, 1070–1091. https://doi.org/10.1177/0735633116649373.

  • Rouse, C., Krueger, A., & Markman, L. (2004). Putting computerised instruction to the test: A randomized evaluation of a ‘scientifically-based’ reading program. Economics of Education Review, 23, 323–338. https://doi.org/10.1016/j.econedurev.2003.10.005.

  • RStudio Team. (2016). RStudio: Integrated Development Environment for R. Boston, MA: RStudio, Inc. http://www.rstudio.com.

  • Skryabin, M., Zhang, J., Liu, L., & Zhang, D. (2015). How the ICT development level and usage influence student achievement in reading, mathematics, and science? Computers & Education, 85, 49–58. https://doi.org/10.1016/j.compedu.2015.02.004.

  • Steffens, K. (2014). ICT use and achievement in three European countries: What does PISA tell us? European Educational Research Journal, 13, 553–562. https://doi.org/10.2304/eerj.2014.13.5.553.

  • Tømte, C., & Hatlevik, O. E. (2011). Gender-differences in self-efficacy ICT related to various ICT-user profiles in Finland and Norway: How do self-efficacy, gender and ICT-user profiles relate to findings from PISA 2006? Computers & Education, 57, 1416–1424. https://doi.org/10.1016/j.compedu.2010.12.011.

  • Zhang, T., Xie, Q., Park, B. J., Kim, Y., Broer, M., & Bohrnstedt, G. (2016). Computer Familiarity and Its Relationship to Performance in Three NAEP Digital-Based Assessments. AIR-NAEP Working Paper #01-2016. Washington, D.C.: American Institutes for Research.

  • Zhong, Z. J. (2011). From access to usage: The divide of self-reported digital skills among adolescents. Computers & Education, 56, 736–746. https://doi.org/10.1016/j.compedu.2010.10.016.


Acknowledgements

Not applicable.

Funding

Not applicable.

Author information


Contributions

All authors conceptualized the structure and the issue of the paper together. JG drafted the background and method sections. NMS analyzed and interpreted the data and drafted the results section. JG and NMS drafted the discussion and conclusion sections together. MAG contributed comments, both in the text and in group discussions, to all sections. All authors read and approved the final manuscript.

Authors’ information

As National Project Manager for the PISA 2018 and PISA 2021 cycles, JG coordinates Dutch participation in both studies. NMS is part of the PISA 2018 and PISA 2021 Dutch national team and was involved in the coding of constructed-response items in the domain of reading. Neither JG nor NMS was involved in PISA 2015. MAG was the scientific director of the Dutch Center for Language Education, which is the National Centre for PISA 2018 and PISA 2021.

Corresponding author

Correspondence to Joyce Gubbels.

Ethics declarations

Competing interests

The authors declare that they have no competing interests.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Appendix

See Table 4.

Table 4 Standardized and unstandardized effects of ICT resources, use, and attitudes on digitally assessed reading performance

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.


About this article


Cite this article

Gubbels, J., Swart, N.M. & Groen, M.A. Everything in moderation: ICT and reading performance of Dutch 15-year-olds. Large-scale Assess Educ 8, 1 (2020). https://doi.org/10.1186/s40536-020-0079-0
