
The relationship between differences in students’ computer and information literacy and response times: an analysis of IEA-ICILS data

Abstract

Background

Due to the increasing use of information and communication technology, computer-related skills are important for all students in order to participate in the digital age (Fraillon, J., Ainley, J., Schulz, W., Friedman, T. & Duckworth, D. (2019). Preparing for life in a digital world: IEA International Computer and Information Literacy Study 2018 International Report. Amsterdam: International Association for the Evaluation of Educational Achievement (IEA). Retrieved from https://www.iea.nl/sites/default/files/2019-11/ICILS%202019%20Digital%20final%2004112019.pdf). Educational systems play a key role in the mediation of these skills (Eickelmann. Second Handbook of Information Technology in Primary and Secondary Education. Cham: Springer, 2018). However, previous studies have shown differences in students’ computer and information literacy (CIL). Although various approaches have been used to explain these differences, process data, such as response times, have never been taken into consideration. Based on data from the IEA-study ICILS 2013 of the Czech Republic, Denmark and Germany, this secondary analysis examines to what extent response times can be used as an explanatory approach for differences in CIL also within different groups of students according to student background characteristics (gender, socioeconomic background and immigrant background).

Methods

First, a latent profile analysis (Oberski 2016) based on response times is used to determine two processing profiles: a fast and a slow processing profile. To examine how these profiles relate to students’ CIL, also in conjunction with students’ background characteristics (socioeconomic and immigrant background), descriptive statistics are used.

Results

The results show that in the Czech Republic and Germany, students belonging to the fast processing profile have on average significantly higher CIL than students allocated to the slow processing profile. In Denmark, there are no significant differences. Concerning the student background characteristics in the Czech Republic, there are significant negative time-on-task effects for all groups except for students with an immigrant background and students with a high parental occupational status. There are no significant differences in Denmark. For Germany, a significant negative time-on-task effect can be found among girls. However, the other examined indicators for Germany are ambiguous.

Conclusions

The results show that process data can be used to explain differences in students’ CIL: in the Czech Republic and Germany, there is a correlation between response times and CIL (a significant negative time-on-task effect). Further analyses should also consider other competences related to CIL (e.g. reading literacy). What becomes clear, however, is that when interpreting and explaining differences in competence, data relating to the completion process during testing should also be included.

Introduction

Seeing how digitalization is becoming an ever more integral part of social and professional environments, competence in new technologies is becoming increasingly important for students (Fraillon et al. 2014, 2019; Gerick et al. 2017; Gerick 2018). In this context, the acquisition of computer and information literacy (CIL) as an interdisciplinary key competence is of particular relevance (Eickelmann 2018). However, empirical findings show differences in CIL between different groups of students when students’ background characteristics (gender, socioeconomic status, immigrant background) are taken into account (summarizing Aesaert and van Braak 2018; Fraillon et al. 2014, 2019). So far, predictors such as computer-based self-efficacy, computer use or computer experience have been used to explain these differences (Hatlevik et al. 2015; Luu and Freeman 2011; Punter et al. 2017). Besides determining such predictors of CIL, computer-based testing in the context of large-scale assessment studies also opens up possibilities to gather data on processing behaviour during testing (so-called process data, such as response times), which can be used to model competencies and explain individual differences during testing (summarizing Goldhammer et al. 2017). Despite this potential to explain differences in competence among students, process data, such as response times, have never been used to explain differences in students’ CIL. The present examination takes up this desideratum.

Following an overview of the theoretical background and the state of research on differences in CIL and on the role of process data, a secondary analysis is presented that draws on representative student data from the IEA-study ICILS 2013 (International Computer and Information Literacy Study; Fraillon et al. 2014) for three Western European countries (the Czech Republic, Denmark and Germany) in which students performed differently in CIL. This analysis investigates the extent to which students’ response times during testing can be used as an explanation for differences in CIL and how CIL differs within different groups of students according to the response times. After the presentation of the results, they are discussed and an outlook on future research is given.

Theoretical background and state of research

CIL and student background characteristics—theoretical background

The ICILS 2013 framework model can be used as a theoretical model to locate CIL and student background characteristics such as gender, socioeconomic background and immigrant background (Fraillon et al. 2014). This framework model distinguishes between the learning antecedents and learning processes with regard to the learning outcomes and thus, the computer and information literacy of the students (Fraillon et al. 2014). The model, therefore, represents a classic input-process-output model. It further assumes that the features or predictors at the antecedent level (input) directly affect the learning processes (process). These learning processes, in turn, are assumed to correlate with students' CIL, the learning outcomes (output), and thus both have an impact on competences and are influenced by competences (Fraillon et al. 2014). Figure 1 shows a graphic representation of this model.

Fig. 1

ICILS 2013 Framework Model (Fraillon et al. 2014)

The described model is used for the present analysis in order to be able to locate the student background characteristics at an input level (Fraillon et al. 2014). At the same time, the student competences (CIL) can be located at the outcome level. What is unaccounted for in the model but relevant for the present research is what takes place at the process level within the framework of competence testing. For this reason, an additional model of theoretical localisation will be used in the course of this article.

CIL and student background characteristics—state of research

The empirical findings regarding gender differences in digital literacy are not clear-cut. For example, some studies show a performance advantage in favour of boys (Goldhammer et al. 2013; Morris and Trushell 2014). One example is the so-called COMPED study (Computers in Education Study), carried out internationally by the IEA at the beginning of the 1990s, which determined the skills of fifth and eighth graders in dealing with new technologies by means of a competence test (Pelgrum et al. 1993). The students’ competences were determined by means of standardized paper-based tests, the contents of which were designed to capture students’ application knowledge and knowledge about the use of computers (Pelgrum et al. 1993). The results showed that boys had on average a higher skill level than girls in all participating countries (Pelgrum et al. 1993).

In contrast to studies that show a performance advantage in favour of boys, such as the COMPED study, other studies do not indicate any gender differences at all (e.g., Punter et al. 2017). For example, in a study in Norway, in which more than 4,000 seventh graders were tested using a web-based module to determine their ‘digital competence’, no differences were identified between girls and boys (Hatlevik and Christophersen 2013).

In turn, other studies found that female students had on average a higher skill level than boys (ACARA 2015; Aesaert and van Braak 2015; Fraillon et al. 2014, 2019; Gebhardt et al. 2019; Thomson 2015). In Australia, for instance, it was observed that girls in sixth and tenth grade displayed a higher computer and information literacy than boys, and the gap between the average achievement levels of girls and boys increased between the two cycles (ACARA 2015, 2018). Similar results can also be found in the US: in 2013, the ‘technology literacy’ of 1,300 eighth graders in the US was tested by means of a web-based performance test, and the girls performed better than the boys (Hohlfeld et al. 2013). In a study of 378 sixth graders from 58 elementary schools in Flanders, Aesaert and van Braak (2015) discovered, by means of a proficiency test on information and communication technology (ICT) skills, that girls have on average higher skills than boys. The ICILS 2013 and ICILS 2018 studies were also able to identify significantly higher levels of computer and information literacy for eighth-grade girls in comparison to the boys in all participating countries by using a computer-based proficiency test (Fraillon et al. 2014, 2019). Through secondary analyses using the ICILS 2013 international database and subscales, Punter et al. (2017) were also able to support the hypothesis that the girls outperformed the boys in the overall results, and discovered differentiated performance patterns: computer-related skills were more in favour of boys and information-related skills more in favour of girls.

A potential reason for the hitherto ambiguous findings with regard to the connection between gender and digital literacy described above could thus be the use of different constructs with which the mentioned studies (e.g. PISA 2009, COMPED, ACARA, ICILS) assess the students’ ICT skills. An explanation for varying results between different countries could be the manifold ways in which different school systems foster ICT skills. In addition, a distinction must be made between self-assessed and actually measured competences (Hatlevik et al. 2017) as studies are often not based on valid competence assessment but on self-assessed skills.

Regarding the socioeconomic background of the students, empirical findings also point to differences. In comparison to gender, these differences are more consistent. Empirical evidence suggests that students from more privileged families display higher digital competencies than those from less privileged homes. For example, studies have identified a link between the socioeconomic background of students and their acquired competencies concerning the use of computers and the internet (Zhong 2011; Zillien and Hargittai 2009). Studies also show that students from less privileged families only possess basic skills in using new technologies (Aesaert and van Braak 2015; Fraillon et al. 2014, 2019; Thomson, 2015). Furthermore, some reports point to a positive correlation between cultural capital and computer and information literacy (Fraillon et al. 2019; Hatlevik et al. 2015). Students whose parents have the highest occupational status have significantly higher digital competences than students whose parents have the lowest occupational status (Fraillon et al. 2019; Thomson 2015). The ACARA (Australian Curriculum Assessment and Reporting Authority) study shows similar findings for Australia. Among sixth and tenth graders whose parents have both a higher occupational status and level of education, higher competencies in using information and communication technologies have been identified in comparison to their classmates, whose parents have both a lower occupational status and level of education (ACARA 2018).

Similar findings exist concerning the immigrant background of students. In terms of access to and the use of computers, studies can be found that reveal no or only minor differences among students with an immigrant background (Bonfadelli et al. 2007; D'Haenens 2003); however, empirical findings show that students without an immigrant background have higher digital competences than those with an immigrant background (Fraillon et al. 2019; Luu and Freeman 2011). In addition to the parents’ country of origin, the language spoken at home can be another indicator of an immigrant background: study findings suggest that the language spoken at home influences school performance (summarized by Fraillon et al. 2019). This is related to the fact that students from immigrant families often do not have sufficient knowledge of the language of instruction (Fraillon et al. 2019). A connection between the language spoken at home and CIL has also been found in ICILS 2013 and 2018 (Fraillon et al. 2014, 2019). The findings of ICILS 2013 and 2018 for Germany also indicate that students whose families speak the test language in the home environment attain higher computer and information literacy than students who speak a different language at home (Fraillon et al. 2014, 2019). In contrast, the results of the ACARA study for Australia show that sixth-grade students who speak a language other than the test language at home have a significantly higher skill level in using information and communication technologies. For the tenth graders, however, there is no significant difference (ACARA 2018). At the same time, tenth grade students who were born in Australia demonstrate a higher skill level than their classmates who were born abroad (ACARA 2018).

Predictors such as computer-based self-efficacy, computer experience and students’ computer use have often been taken into consideration in order to explain such differences in CIL (Hatlevik et al. 2018; Livingstone and Helsper 2010; Rohatgi et al. 2016).

The role of response times—theoretical background

In addition to the abovementioned predictors (e.g. computer-based self-efficacy, computer experience and computer use), computer-based testing in the context of large-scale assessment enables the determination of process data such as response times, which can describe individual behavioural differences during the process of task completion and thereby relate to task success (summarizing Goldhammer et al. 2017). Process data, such as response times (the time taken to complete a task), not only allow personal differences in behaviour to be used as an explanatory approach for competency modelling, but can also be used to explain performance differences in this context (ibid.).

Since the process data is not explicitly considered in the already presented ICILS model, another model according to Naumann (2012) is also used to theoretically embed the presented analysis. This model also represents a type of input-process-output model. In this framework model it is assumed that the completion process during testing (process), which is influenced by person-level characteristics and task-level characteristics (input), has a direct influence on the result of task completion (output) (Goldhammer et al. 2017; Naumann 2012). Figure 2 shows a graphic representation of the model.

Fig. 2

Theoretical Model based on Naumann (2012), modified by Goldhammer (2013)

Against the background of the ICILS model described above, both theoretical models can be merged for the present work: at the input level, it is possible to locate the student background characteristics which, in the context of the present analysis, are assumed to have a direct influence on the level of the completion process in which the response times are located. From the level of the completion process, a direct influence on the results of task completion can be assumed. Here again, parallels to the ICILS model can be seen, from which, in turn, the reciprocal relationship between the process level and the output level can be adapted in the course of further analyses.

The role of response times—state of research

Empirical findings indicate correlations between response times and processing success. These relationships are referred to as "time-on-task effects", where positive time-on-task effects (long response times associated with high processing success) and negative time-on-task effects (long response times associated with low processing success) can be distinguished (Goldhammer et al. 2014, 2017; Naumann and Goldhammer 2017). With regard to problem-solving tasks in PIAAC (Programme for the International Assessment of Adult Competencies), Goldhammer et al. (2014) determined both positive and negative time-on-task effects. In PIAAC, they used “a specific concept of problem solving […]; it refers to solving problems in technology-rich environments” (Goldhammer et al. 2014, p. 10). The study by Stelter et al. (2015), for example, used data from the PIAAC study and built upon the research of Goldhammer et al. (2014). They analysed the specific share of time spent on basic subtasks of PIAAC problem-solving tasks which could be solved through automated cognitive processing (Goldhammer et al. 2017; Stelter et al. 2015). The idea behind this study was that as soon as basic subtasks in problem-solving tasks are performed through automated processing, cognitive resources become available that benefit task processing and thus processing success (Goldhammer et al. 2017). As a result, negative time-on-task effects could be determined for problem-solving tasks. For reading tasks, too, correlations between response times and results were found; thus, both positive and negative time-on-task effects could be determined during reading tasks (Goldhammer et al. 2014; Su 2017). Even within one study, different effects could be detected: a positive time-on-task effect was identified for slow digital readers in difficult tasks and tasks with high navigation requirements, while at the same time a negative time-on-task effect was detected for simple tasks with low navigation requirements (Naumann and Goldhammer 2017).

Research desideratum and research questions

Despite the abovementioned potential of process data, such as response times, for analysing behaviour during testing in order to explain differences in competence, there is a lack of research on the extent to which response times can explain differences in students’ computer and information literacy (CIL).

Therefore, the present analysis focuses on the following research questions:

1. How do the response times relate to the CIL of students in Denmark, Germany and the Czech Republic?

2. How does CIL differ in terms of response times within different groups of students according to students’ background characteristics (gender, socioeconomic background and immigrant background)?

Data and methods

Sample

For the present secondary analysis, the representative student data of the Czech Republic (N = 3,066), Denmark (N = 1,767) and Germany (N = 2,225) from the IEA-study ICILS 2013 (Fraillon et al. 2015) is used. The uniqueness of the ICILS 2013 study is that students’ competencies in using information and communication technologies, i.e. their computer and information literacy (CIL), were assessed by means of computer-based performance tests for the first time (Fraillon et al. 2014). In addition to the competence test, the students took part in a written survey in which, among other things, background information about the students such as gender, socioeconomic and immigrant background as well as other contextual information was recorded. Based on the framework concept, which in turn was based on the literacy concept, test instruments were developed for the survey to allow a computer-based determination of CIL.

Country selection

The country selection incorporated Western European countries in which student performance in CIL differed. Germany was in the middle of the international field in terms of students’ CIL in ICILS 2013 (M = 523 points, SE = 2.40). The Czech Republic was chosen as a reference country, since it was one of the top performers in the study (M = 553 points, SE = 2.10). Students in Denmark (M = 542 points, SE = 3.50), in turn, performed worse than students in the Czech Republic, but better than students in Germany.

Variables

First, the so-called timing-items are used; they represent the response times (in seconds) for the individual test tasks, distributed over the four test modules, for each student. Student background characteristics (gender, socioeconomic background and immigrant background), collected through the questionnaires, are also utilised for the present analysis of the timing-items. Gender is operationalized by the question 'Are you a girl or a boy?'.

Previous analyses of the ICILS 2013 data (e.g. Hatlevik et al. 2018) also show that the cultural capital determined by the number of books in the household can be used as an indicator of the socioeconomic background. For this reason, this indicator is also used in this paper (high cultural capital = more than 100 books; low cultural capital = 100 books or less). Furthermore, the occupation of the parents is operationalised in the context of the International Socio-Economic Index of Occupational Status (ISEI; Ganzeboom et al. 1992). According to this indicator, low values suggest a low socioeconomic background and high values a high socioeconomic background. Therefore, the following groups are formed consistent with previous ICILS 2013 analyses (e.g. Fraillon et al. 2014): low parental occupational status (less than 40 points), medium parental occupational status (40 to 59 points) and high parental occupational status (60 points or more).
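The grouping of the continuous HISEI scores into the three status categories described above can be sketched as follows (a minimal illustration of the cut-offs named in the text; the function name and labels are hypothetical, not taken from the ICILS codebooks):

```python
def hisei_category(hisei: float) -> str:
    """Map a continuous HISEI score to the three occupational
    status groups used in the ICILS 2013 analyses (hypothetical helper)."""
    if hisei < 40:
        return "low"      # low parental occupational status (< 40 points)
    elif hisei < 60:
        return "medium"   # medium parental occupational status (40-59 points)
    else:
        return "high"     # high parental occupational status (60+ points)

# e.g. hisei_category(35) -> "low", hisei_category(60) -> "high"
```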

The immigrant background is, on the one hand, represented by the language spoken at home, whereby a distinction is made as to whether the test language or another language is used in the home environment. On the other hand, the immigrant background is represented by the parents’ country of birth. This results in the following categories: no parent born abroad, one parent born abroad and both parents born abroad. An overview of the student background characteristics and the corresponding computer and information literacy distribution is shown in Table 1.

Table 1 Overview of descriptives of background characteristics related to computer and information literacy (CIL) of different groups of 8th grade students in the Czech Republic, Denmark and Germany (see also Fraillon et al. 2014)

In addition to the background characteristics and the timing-items, the five plausible values of the performance test which map CIL are used for further analysis. Furthermore, the student weight is included in the analysis.
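When statistics are computed on plausible values, the usual convention in large-scale assessments is to compute the statistic separately on each of the five values and average the results for the point estimate (variance combination via Rubin's rules is omitted here). A minimal sketch with invented numbers:

```python
import statistics

def combine_plausible_values(pv_statistics):
    """Combine a statistic computed separately on each plausible value:
    the point estimate is the mean across the per-PV results."""
    return statistics.fmean(pv_statistics)

# e.g. a mean CIL score computed on each of the five plausible values
# (values are invented for illustration)
estimate = combine_plausible_values([522.8, 523.4, 523.1, 522.6, 523.6])
```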

Methods

The selected timing-items were prepared for the further analysis by means of the so-called z-score standardization. Due to the nature of the data, the raw response times of different tasks are not directly comparable; the data must, therefore, first be prepared in such a way that the available response times for the respective tasks can be compared. To calculate the z-scores, the item mean is subtracted from each value, and the result is divided by the item's standard deviation. This calculation is depicted in the following formula (Mohamad and Usman 2013, p. 3300):

$${x}_{ij}=Z\left({x}_{ij}\right)=\frac{{x}_{ij}-{\bar{x}}_{j}}{{\sigma }_{j}}$$

By default, the variable that exists after the z-score standardization always has a mean value of 0 and a standard deviation of 1 (for example Mohamad and Usman 2013).
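The per-item standardization described by the formula can be sketched as follows (an illustrative helper, not the original analysis code; the sample times are invented):

```python
import statistics

def z_standardize(times):
    """Z-standardize the response times of one test item across all
    students: subtract the item mean, divide by the item standard
    deviation (population SD, matching the formula's sigma_j)."""
    mean = statistics.fmean(times)
    sd = statistics.pstdev(times)
    return [(t - mean) / sd for t in times]

# After standardization the item has mean 0 and SD 1 by construction.
scores = z_standardize([30.0, 45.0, 60.0, 75.0, 90.0])
```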

As a first step to make students’ response times more tangible and comparable, a latent profile analysis (LPA; Oberski 2016) using the software Mplus (Muthén and Muthén 2012) is carried out to identify possible processing profiles. The student weight is also used for the complete analysis to approximate the sample to the population and thus, to prevent possible distortions in the results (Jung and Carstens 2015). To answer the research questions, descriptive statistics are applied using the processing profiles to determine the extent to which differences in CIL can be explained by response times. In addition, descriptive statistics are used to measure how CIL varies in terms of response times within different groups of students due to student background characteristics. This is done using the IEA-IDB-Analyzer (Mirazchiyski 2015).

Results

Results RQ1: processing profiles and CIL

Based on the parsimony principle, the interpretability, the mean class membership probabilities and the entropy value as criteria for evaluating the model quality (cf. Nylund et al. 2007; Tein et al. 2013), a solution with two profiles was selected; solutions with three to six profiles were discarded given the profile sizes and the associated limited interpretability (cf. Table 2).

Table 2 Latent profile analysis—criteria for evaluating the model quality

The first profile can be labelled the "fast processing profile"; it includes 81.07% of the students. The second profile, referred to as the "slow processing profile", accounts for only 18.93% of the students. While students in the first profile completed the tasks on average at a faster pace, the students in the second profile needed on average more time to complete the tasks. Exceptions are only found in the so-called authoring tasks, also referred to as "big tasks", of which each test module contains one (Fraillon et al. 2014, 2019). In comparison to the other task types (e.g. multiple choice) of the respective test module, an authoring task is a more complex task type, as test participants have to create information products (e.g. a presentation) (Fraillon et al. 2014, 2019). The students who work on average at a faster pace need more time for these specific tasks; students allocated to the slow processing profile, on the other hand, completed these tasks at a faster processing speed (cf. Fig. 3). In the Czech Republic, most of the students fall into the first profile (71.32%), while 28.68% belong to the second profile. In Denmark, 88.10% are part of the first profile and 11.90% of the second. In Germany, 82.51% of the students are in the first profile and 17.49% can be allocated to the second profile (cf. Fig. 3).

Fig. 3

Latent Profile Analysis: Processing profiles. aDid not meet sample requirements. b “Met guidelines for sampling participation rates only after replacement schools were included” (Fraillon et al. 2014, p. 112)

With regard to the first research question, the analysis shows that the Czech and German students belonging to the first profile have on average a significantly higher CIL than the students belonging to the second profile (Czech Republic profile 1: M = 558 points; profile 2: M = 541 points/Germany profile 1: M = 526 points; profile 2: M = 510 points/p < 0.05). In Denmark, there is no such significant difference between the students (profile 1: M = 542 points/profile 2: M = 536 points). Response times can thus be used to explain students’ CIL in the Czech Republic and Germany; we can therefore speak of a significant negative time-on-task effect (cf. Table 3).

Table 3 Processing profiles and computer and information literacy (CIL)

Results RQ2: processing profiles and CIL regarding students’ background characteristics

Regarding the second research question, the following results are shown with regard to the student background characteristic of gender (cf. Table 4): In the Czech Republic as well as in Germany, the girls assigned to the fast processing profile (Czech Republic: M = 565 points, 67.94%; Germany: M = 536 points, 80.91%) display significantly higher computer and information literacy than the girls from the slow processing profile (Czech Republic: M = 547 points, 32.06%; Germany: M = 517 points, 19.09%). In Denmark, there is no such significant difference among the girls (profile 1: M = 550 points, 87.23%; profile 2: M = 541 points, 12.77%). In the Czech Republic, a significantly higher computer and information literacy can be seen for the boys assigned to the fast processing profile (M = 552 points, 74.76%) in comparison to the boys allocated to the slow processing profile (M = 533 points, 25.24%). A similar significant difference can be determined neither for the boys in Denmark nor for those in Germany (Denmark profile 1: M = 535 points, 88.94%; profile 2: M = 531 points, 11.06%/Germany profile 1: M = 518 points, 84.00%; profile 2: M = 503 points, 16.00%).

Table 4 Processing profiles, computer and information literacy (CIL) and gender

So-called negative time-on-task effects can also be seen with regard to the socioeconomic background, in this case the cultural capital (number of books in the household) (cf. Table 5). In the Czech Republic, a negative time-on-task effect can be observed both for students with high cultural capital (profile 1: M = 576 points, 73.27%; profile 2: M = 560 points, 26.73%) and for students with low cultural capital (profile 1: M = 548 points, 70.00%; profile 2: M = 530 points, 30.00%). Such a significant negative time-on-task effect is also evident in Germany concerning students with high cultural capital (profile 1: M = 552 points, 82.43%; profile 2: M = 538 points, 17.57%), but not among students with low cultural capital (profile 1: M = 507 points, 83.13%; profile 2: M = 491 points, 16.87%). In Denmark, neither students with high cultural capital (profile 1: M = 564 points, 88.71%; profile 2: M = 561 points, 11.29%) nor students with low cultural capital (profile 1: M = 531 points, 88.16%; profile 2: M = 526 points, 11.84%) show a significant correlation between their processing profile/processing time and their computer and information literacy (cf. Table 5).

Table 5 Processing profiles, computer and information literacy (CIL) and cultural capital

Results concerning the parental occupation as a further indicator of socioeconomic background can be found in Table 6. For the Czech Republic, a significant negative time-on-task effect for students from families with a HISEI of less than 40 points is visible: the 68.08% of the students who belong to the fast profile have a higher computer and information literacy (M = 540 points) in comparison to the 31.92% of students in the slow profile (M = 519 points). There is also a significant negative time-on-task effect among students with a HISEI of 40 to 59 points (profile 1: M = 564 points, 72.43%; profile 2: M = 550 points, 27.57%). However, students with a HISEI of 60 points or more show no significant differences between the profiles. Furthermore, there are no significant differences regarding any of the HISEI categories in Denmark. Nonetheless, a positive though not significant time-on-task effect can be noted here for students with a HISEI of 60 points or more: the 88.91% of students belonging to the fast profile achieved an average of 562 points and thus fewer points than the 11.09% of students belonging to the slow profile (M = 565 points). For Germany, a significant negative time-on-task effect can be identified for students with a HISEI of less than 40 points (profile 1: M = 506 points, 79.54%; profile 2: M = 483 points, 20.46%).

Table 6 Processing profiles, computer and information literacy (CIL) and HISEI

With regard to the immigrant background, as determined by the language spoken at home, the following results are shown (cf. Table 7): In the Czech Republic, it can be ascertained that the students without an immigrant background (the language spoken at home is the same as the test language) who belong to the fast processing profile have a significantly higher computer and information literacy (M = 559 points, 71.51%) than those allocated to the slow profile (M = 542 points, 28.49%). For students with an immigrant background (the language spoken at home differs from the test language), there is no significant difference in the Czech Republic (profile 1: M = 548 points, 64.19%; profile 2: M = 529 points, 35.81%). Likewise, in Denmark, there are no significant differences regarding the students with an immigrant background (profile 1: M = 502 points, 84.49%; profile 2: M = 495 points, 15.51%) or without an immigrant background (profile 1: M = 546 points, 88.7%; profile 2: M = 544 points, 11.3%). In Germany, as in the Czech Republic, only those students whose families use the test language in the home environment show a significant negative time-on-task effect (profile 1: M = 534 points, 83.16%; profile 2: M = 520 points, 16.84%). There was no significant difference for students with an immigrant background (profile 1: M = 491 points, 80.01%; profile 2: M = 473 points, 19.99%).

Table 7 Processing profiles, computer and information literacy (CIL) and language spoken in the home environment

In addition to the language spoken at home, the parents’ country of birth is included in the analysis of immigrant background (cf. Table 8). The Czech Republic displays, as with the language spoken at home, a significant negative time-on-task effect for students without an immigrant background: 71.57% of the students whose parents were not born abroad belong to the fast profile (M = 559), while the other 28.43% belong to the slow profile with lower CIL (M = 542). For Denmark, there are no significant differences between the profiles, but a second, not significant, positive time-on-task effect can be identified: 13.32% of the students whose parents were both born abroad fall into the slow profile. They achieve on average 507 points in CIL, while the other 86.68% in the fast profile achieve 500 points. Here, however, the small number of students in the slow profile should be noted. For Germany, a significant negative time-on-task effect can be identified for students whose parents were both born abroad: In this group, 80.45% of the students can be allocated to the fast profile with an average computer and information literacy of 504 points; the remaining 19.55% are assigned to the slow profile and display fewer scale points (M = 478) and therefore a lower CIL.

Table 8 Processing profiles, computer and information literacy (CIL) and parents’ country of birth

Taking into account the student background characteristics of gender, socioeconomic background (determined here via cultural capital/number of books at home and HISEI) and the immigrant background (determined here via language use at home and parents’ country of birth), the summarized results regarding the second research question for the three countries are presented in Table 9.

Table 9 Time-on-task effects regarding students’ background characteristics in the Czech Republic, Denmark and Germany

In the Czech Republic, there are significant differences in CIL for all groups except students with an immigrant background (the language used at home differs from the test language, or one or both parents were born abroad) and students with a high parental occupational status; for all other groups, there is a significant negative time-on-task effect. As shown in Table 9, however, there are no significant differences between the processing profiles in CIL for Denmark, even when students’ background characteristics are considered. For Germany, a significant negative time-on-task effect can be found for girls. The other examined indicators for Germany are ambiguous: students with higher cultural capital show a significant negative time-on-task effect, as do students whose parents have a low occupational status. Additionally, significant negative time-on-task effects can be highlighted for students who use the test language at home and for students whose parents were both born abroad. No significant differences in CIL can be found for the other groups.
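As a hedged illustration of the kind of group comparison summarized above, the following sketch runs a Welch two-sample t-test on simulated CIL scores for a fast and a slow profile. The means and profile shares loosely mimic those reported for the Czech Republic, but the data, sample sizes and variable names are purely illustrative; the actual ICILS analyses additionally involve plausible values and replicate weights.

```python
# Illustrative sketch only: simulated CIL scores, not ICILS 2013 data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
# Simulated CIL scores for one student group, split by processing profile
# (means and shares loosely follow the reported Czech values).
cil_fast = rng.normal(540, 80, size=1360)  # fast profile, ~68% of group
cil_slow = rng.normal(519, 80, size=640)   # slow profile, ~32% of group

# Welch's t-test (unequal variances) on the profile means.
t, p = stats.ttest_ind(cil_fast, cil_slow, equal_var=False)
print(f"t = {t:.2f}, p = {p:.4f}")
```

A positive, significant t statistic here corresponds to what the paper calls a negative time-on-task effect: the faster profile scores higher.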

Discussion and conclusions

The ICILS 2013 study identified differences in students’ CIL, particularly with regard to gender, socioeconomic background and immigrant background (Fraillon et al. 2014). The present analysis utilises the potential of process data to explain individual differences in competence tests and examines the relationship between CIL and response times on the basis of response times in IEA-ICILS 2013. For this purpose, two processing profiles (a fast and a slow processing profile) were determined using a latent profile analysis. Students who belong to the fast profile finish the tasks on average at a fast pace; those who belong to the slow profile finish them on average at a slow pace. The only exceptions are the so-called large tasks, a more complex task type in which test participants have to create information products (i.e. a presentation) (Fraillon et al. 2014, 2019). Here, students in the fast profile finish the tasks more slowly on average; a more intensive preoccupation with a task could be associated with greater care and thus increase the probability of a correct answer. Students in the slow profile, on the other hand, finish these tasks faster on average, perhaps because students who already know the answers also give a correct answer more quickly. Thus, differences in response times with regard to task types in ICILS 2013 become apparent, which should be further analysed in the future.
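The profile assignment described above can be sketched with a small, hypothetical example: a Gaussian mixture model with diagonal covariances (a common computational stand-in for latent profile analysis) fitted to simulated, z-standardized response times. The data, group sizes and variable names are illustrative assumptions, not the ICILS 2013 data.

```python
# Illustrative sketch only: simulated response times, not ICILS 2013 data.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(42)
# Simulate z-standardized response times of 1000 students on 8 tasks:
# a faster group (negative mean) and a slower group (positive mean).
fast = rng.normal(-0.5, 1.0, size=(700, 8))
slow = rng.normal(0.8, 1.0, size=(300, 8))
times = np.vstack([fast, slow])

# Diagonal covariances mirror the usual LPA assumption of conditional
# independence of the indicators within a profile.
lpa = GaussianMixture(n_components=2, covariance_type="diag",
                      random_state=0)
profiles = lpa.fit_predict(times)

# Label the profile with the lower mean response time as "fast".
fast_profile = int(np.argmin(lpa.means_.mean(axis=1)))
share_fast = float(np.mean(profiles == fast_profile))
print("share of students in fast profile:", round(share_fast, 3))
```

In the real analysis the profile solution would additionally be judged by the quality criteria the paper refers to (e.g. entropy, information criteria, profile sizes), not by classification alone.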

On the basis of the profiles, it was found that students from the Czech Republic and Germany who belong to the fast profile have significantly higher CIL than students belonging to the slow profile, i.e. a significant negative time-on-task effect (Goldhammer et al. 2014, 2017; Naumann and Goldhammer 2017). Only in Denmark is there no significant difference. In terms of student background characteristics, significant negative time-on-task effects can be observed in the Czech Republic for the majority of the groups (girls, boys, students with higher and lower cultural capital, students with low and medium parental occupational status, and students without an immigrant background), but not for students with an immigrant background or a high parental occupational status. While no significant effects are noted in Denmark, there are significant negative time-on-task effects in Germany for girls. The results regarding socioeconomic and immigrant background in Germany, meanwhile, are less clear: there are significant negative time-on-task effects for students with high cultural capital as well as for students with a low parental occupational status. Furthermore, significant negative time-on-task effects can be found for students who speak the test language at home and for students whose parents were both born abroad. Especially against the background of these results, it seems necessary to go into greater depth in further analyses and to look at the tasks and results along the various indicators in smaller steps in order to thoroughly interpret the results described.

Against the background of these results, it can be discussed to what extent response times can actually explain the differences in students’ CIL along the student background characteristics of gender, socioeconomic background and immigrant background. Although there are clear correlations between CIL and the processing profiles in the Czech Republic and Germany, it can be questioned to what extent this result can serve as the sole explanatory approach for the described disparities in CIL in association with student background characteristics. Future studies may also explore the degree to which country-specific curricular requirements for CIL play a role.

Furthermore, the methodological approach should be discussed: importantly, the results for the large tasks make it clear that the latent profile analysis is only a first methodological approach to making the response times tangible. In further analyses, it therefore seems logical to focus on the different types of tasks. Further methodological approaches that have been used successfully in previous investigations of time-on-task effects are conceivable for subsequent analyses (e.g. the generalized linear mixed model (GLMM) framework; Goldhammer et al. 2014). With regard to the selection of the profiles, further analyses must also check whether additional profiles can be determined based on the quality criteria in a more fine-grained, step-by-step evaluation by modules or task types. In addition, since the quality criteria listed could also support other profile solutions, it makes sense to apply a different methodological approach, such as a cluster analysis, in order to empirically support the choice of the two-profile solution. It would also be viable to include the data from the second ICILS cycle, although it should be noted that only Denmark and Germany participated in the study again. Regarding the results of Denmark in comparison to the Czech Republic and Germany, it must be discussed why there are no significant differences between the two processing profiles, not even with regard to student background characteristics. One reason might be the small sample sizes resulting from the split by profiles and single background characteristics. However, when interpreting the results for the three countries, it should be noted that Denmark does not meet the sampling requirements, which may have affected the results.
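One way to support the two-profile choice empirically, complementary to the cluster analysis suggested above, is to compare profile solutions by an information criterion such as the BIC. The sketch below is a hypothetical illustration on simulated response-time data, not a reproduction of the paper’s model selection.

```python
# Illustrative sketch only: simulated response times, not ICILS 2013 data.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(7)
# Two simulated latent groups of z-standardized response times on 8 tasks.
times = np.vstack([
    rng.normal(-0.5, 1.0, size=(700, 8)),
    rng.normal(0.8, 1.0, size=(300, 8)),
])

# Fit mixture models with 1 to 4 profiles and record the BIC of each;
# lower BIC indicates the preferred solution.
bics = {}
for k in range(1, 5):
    gm = GaussianMixture(n_components=k, covariance_type="diag",
                         n_init=5, random_state=0).fit(times)
    bics[k] = gm.bic(times)

print("BIC by number of profiles:", {k: round(v) for k, v in bics.items()})
print("preferred solution:", min(bics, key=bics.get), "profiles")
```

In practice one would inspect the full pattern of criteria (BIC, entropy, likelihood-ratio tests; cf. Nylund et al. 2007; Tein et al. 2013) rather than a single statistic.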
The present analyses do not take low-performing countries into account, as the aim is to generate indications of how variations in good performance can be explained. For further analyses, it may therefore be useful to select additional participating countries with a very low level of performance, or other countries with a similar level of performance, in order to improve comparability and determine how performance relates to response times.

It seems sensible to include additional predictors in further analyses in order to explain the differences found between and across the countries. Particularly against the background of the abovementioned research, findings on the relationship between response times, reading tasks (Goldhammer et al. 2014; Su 2017) and other predictors can be used to explain differences in students’ CIL (Hatlevik et al. 2018; Livingstone and Helsper 2010; Rohatgi et al. 2016).

The results of this initial analysis based on the response times of the computer-based test modules clearly reveal that, in order to explain and interpret differences in CIL, it makes sense to consider extra- and in-school conditions as well as process data, which capture behaviour during the test. In addition to a better understanding of how response times are related to CIL, particularly regarding differences between groups of students, the research presented in this article also offers potential insights for school practice. In the context of individual support and individualized learning processes, process data will increasingly play a role in the future (e.g. Wang et al. 2018), especially in the context of diagnostic measures. Thus, the further analysis of process data, such as response times, is of particular relevance for future research in the context of school development processes. The analysis of process data is also becoming increasingly important for the (further) development of competence testing within the framework of school performance studies. Both the time frame allowed for the processing of tasks and the task design must be called into question.

Availability of data and materials

The data of ICILS 2013 are publicly available on the IEA website (https://www.iea.nl/data-tools/repository/icils).

Abbreviations

ACARA:

Australian Curriculum Assessment and Reporting Authority

CIL:

Computer and information literacy

COMPED:

Computer in Education

ICILS:

International Computer and Information Literacy Study

IEA:

International Association for the Evaluation of Educational Achievement

PIAAC:

Programme for the International Assessment of Adult Competencies

References

  1. ACARA. (2015). National Assessment Program – ICT Literacy. Years 6 & 10. Report 2014. Sydney: ACARA. Retrieved from https://www.nap.edu.au/_resources/D15_8761__NAP-ICT_2014_Public_Report_Final.pdf

  2. ACARA. (2018). NAP Sample Assessment ICT Literacy. Years 6 and 10. Sydney: ACARA. Retrieved from https://www.nap.edu.au/docs/defaultsource/default-document-library/2017napictlreport_final.pdf?sfvrsn=b5696d5e_2

  3. Aesaert, K., & van Braak, J. (2018). Information and communication competences for students. In J. Voogt, G. Knezek, R. Christensen & K.-W. Lai (Eds.), Second Handbook of Information Technology in Primary and Secondary Education (pp. 255–269). Cham: Springer. https://doi.org/10.1007/978-3-319-53803-7

  4. Aesaert, K., & van Braak, J. (2015). Gender and socioeconomic related differences in performance based ICT competences. Computers & Education, 84, 8–25. https://doi.org/10.1016/j.compedu.2014.12.017

  5. Eickelmann, B. (2018). Section introduction: international policies on information and communication technology in primary and secondary schools. In J. Voogt, G. Knezek, R. Christensen & K.-W. Lai (Eds.), Second Handbook of Information Technology in Primary and Secondary Education. Cham: Springer. https://doi.org/10.1007/978-3-319-53803-7

  6. Gerick, J., Eickelmann, B. & Bos, W. (2017). The international computer and information literacy study from a European perspective: introduction to the special issue. European Educational Research Journal, 16(6), 707–715. https://doi.org/10.1177/1474904117735417

  7. Bonfadelli, H., Bucher, P., & Piga, A. (2007). Use of old and new media in ethnic minority youth in Europe with a special emphasis on Switzerland. Communications, 32(2), 141–170. https://doi.org/10.1515/COMMUN.2007.010

  8. D’Haenens, L. (2003). ICT in multicultural society The Netherlands: A context for sound multiform media policy? Gazette, 65(4–5), 401–421. https://doi.org/10.1177/0016549203654006

  9. Fraillon, J., Ainley, J., Schulz, W., Friedman, T. & Duckworth, D. (2019). Preparing for life in a digital world: IEA International Computer and Information Literacy Study 2018 International Report. Amsterdam: International Association for the Evaluation of Educational Achievement (IEA). Retrieved from https://www.iea.nl/sites/default/files/2019-11/ICILS%202019%20Digital%20final%2004112019.pdf

  10. Fraillon, J., Schulz, W., Friedman, T., Ainley, J., & Gebhardt, E. (2015). ICILS 2013 Technical Report. Amsterdam: IEA Secretariat. https://doi.org/10.15478/uuid:b9cdd888-6665-4e9f-a21e-61569845ed5b

  11. Fraillon, J., Ainley, J., Schulz, W., Friedman, T., & Gebhardt, E. (2014). Preparing for Life in a Digital Age. The IEA International Computer and Information Literacy Study International Report. Melbourne: Australian Council for Educational Research (ACER). https://doi.org/10.1007/978-3-319-14222-7

  12. Ganzeboom, H. B. G., de Graaf, P. M., & Treiman, D. J. (1992). A standard international socio-economic index of occupational status. Social Science Research, 21(1), 1–56.

  13. Gebhardt, E., Thomson, S., Ainley, J., & Hillman, K. (2019). Gender differences in computer and information literacy. An In-depth Analysis of Data from ICILS. Cham: Springer. https://doi.org/10.1007/978-3-030-26203-7_1

  14. Gerick, J. (2018). School Level Characteristics and Students’ CIL in Europe—a latent class analysis approach. Computers & Education, 120, 160–171. https://doi.org/10.1016/j.compedu.2018.01.013

  15. Goldhammer, F. (2013). Prozessbezogene Verhaltensdaten für Rückmeldung in digitalen Lernumgebungen. [Process-related behavioral data for feedback in digital learning environments.] Deutsches Institut für Internationale Pädagogische Forschung (DIPF). Zentrum für internationale Bildungsvergleichsstudien (ZIB).

  16. Goldhammer, F., Naumann, J., Rölke, H., Stelter, A. & Tóth, K. (2017). Relating Product Data to Process Data from Computer-Based Competency Assessment. In D. Leutner, J. Fleischer, J. Grünkorn & E. Klieme (Eds.), Competence Assessment in Education. Research, Models and Instruments (pp. 407–425). Heidelberg: Springer. https://doi.org/10.1007/978-3-319-50030-0

  17. Goldhammer, F., Naumann, J., Stelter, A., Rölke, H., Tóth, K., & Klieme, E. (2014). The time-on-task effect in reading and problem solving is moderated by item difficulty and ability: Insights from computer-based large-scale assessment. Journal of Educational Psychology, 106, 608–626. https://doi.org/10.1037/a0034716

  18. Goldhammer, F., Naumann, J., & Keßel, Y. (2013). Assessing individual differences in basic computer skills: Psychometric characteristics of an interactive performance measure. European Journal of Psychological Assessment, 29, 263–275. https://doi.org/10.1027/1015-5759/a000153

  19. Hatlevik, O. E., Throndsen, I., Loi, M., & Gudmundsdottir, G. B. (2018). Students’ ICT self-efficacy and computer and information literacy: Determinants and relationships. Computers & Education, 118, 107–119. https://doi.org/10.1016/j.compedu.2017.11.011

  20. Hatlevik, O. E., Scherer, R., & Christophersen, K.-A. (2017). Moving beyond the study of gender differences: An analysis of measurement invariance and differential item functioning of an ICT literacy scale. Computers & Education, 113, 280–293. https://doi.org/10.1016/j.compedu.2017.06.003

  21. Hatlevik, O. E., Ottestad, G., & Throndsen, I. (2015). Predictors of digital competence in 7th grade: a multilevel analysis. Journal of Computer Assisted Learning, 31, 220–231. https://doi.org/10.1111/jcal.12065

  22. Hatlevik, O. E., & Christophersen, K.-A. (2013). Digital competence at the beginning of upper secondary school: Identifying factors explaining digital inclusion. Computers & Education, 63, 240–247. https://doi.org/10.1016/j.compedu.2012.11.015

  23. Hohlfeld, T. N., Ritzhaupt, A. D., & Barron, A. E. (2013). Are gender differences in perceived and demonstrated technology literacy significant? It depends on the model. Educational Technology Research and Development, 61(4), 639–663. https://doi.org/10.1007/s11423-013-9304-7

  24. Jung, M., & Carstens, R. (Eds.). (2015). ICILS 2013 User Guide for the International Database. Amsterdam: IEA. https://doi.org/10.15478/uuid:73a9f018-7b64-4299-affc-dc33fe57f3e1

  25. Livingstone, S., & Helsper, E. (2010). Balancing opportunities and risks in teenagers’ use of the internet: the role of online skills and internet self-efficacy. New Media & Society, 12(2), 309–329. https://doi.org/10.1177/1461444809342697

  26. Luu, K., & Freeman, J. G. (2011). An analysis of the relationship between information and communication technology (ICT) and scientific literacy in Canada and Australia. Computers & Education, 56(4), 1072–1082. https://doi.org/10.1016/j.compedu.2010.11.008

  27. Mirazchiyski, P. (2015). Analyzing ICILS 2013 data using the IEA IDB Analyzer. In M. Jung & R. Carstens (Eds.), ICILS 2013 User Guide for the International Database (pp. 49–86). Amsterdam: IEA. https://doi.org/10.15478/uuid:73a9f018-7b64-4299-affc-dc33fe57f3e1

  28. Mohamad, I. B., & Usman, D. (2013). Standardization and its effects on k-means clustering algorithm. Research Journal of Applied Science, Engineering and Technology, 6(17), 3299–3303. https://doi.org/10.19026/rjaset.6.3638

  29. Morris, D., & Trushell, J. (2014). Computer programming, ICT and gender in the classroom: a male-dominated domain or a female preserve? Research in teacher education, 4(1), 4–9. Retrieved from https://www.uel.ac.uk/wwwmedia/microsites/riste/Article-1-David-Morris-and-John-Trushell.pdf

  30. Muthén, L. K., & Muthén, B. O. (2012). Mplus: Statistical Analysis With Latent Variables. User’s Guide (7th ed.). Los Angeles, CA: Muthén & Muthén. Retrieved from https://www.statmodel.com/download/usersguide/Mplus%20Users%20Guide%20v6.pdf

  31. Naumann, J. (2012). Belastungen und Ressourcen beim Lernen aus Text und Hypertext. [Costs and resources in learning from text and hypertext]. (Habilitation thesis). Goethe Universität Frankfurt, Frankfurt, Germany. https://doi.org/10.13140/RG.2.2.34203.46888

  32. Naumann, J., & Goldhammer, F. (2017). Time-on-task effects in digital reading are non-linear and moderated by persons’ skills and tasks’ demands. Learning and Individual Differences, 53, 1–16. https://doi.org/10.1016/j.lindif.2016.10.002

  33. Nylund, K. L., Asparouhov, T., & Muthén, B. O. (2007). Deciding on the number of classes in latent class analysis and growth mixture modeling: A monte carlo simulation study. Structural Equation Modeling, 14(4), 535–569. https://doi.org/10.1080/10705510701575396

  34. Oberski, D. (2016). Mixture Models: Latent Profile and Latent Class Analysis. In J. Robertson & M. Kaptein (Eds.), Modern Statistical Methods for HCI (pp. 275–287). Switzerland: Springer. https://doi.org/10.1007/978-3-319-26633-6

  35. Pelgrum, W. J., Reinen, I. A. M. J., & Plomp, T. (1993). Schools, teachers, students and computers: A cross-national perspective. IEA-Comped Study Stage 2. Enschede: University of Twente. Retrieved from https://files.eric.ed.gov/fulltext/ED372734.pdf

  36. Punter, R. A., Meelissen, M. R., & Glas, C. A. (2017). Gender differences in computer and information literacy: An exploration of the performances of girls and boys in ICILS 2013. European Educational Research Journal, 16(6), 762–780. https://doi.org/10.1177/1474904116672468

  37. Rohatgi, A., Scherer, R., & Hatlevik, O. E. (2016). The role of ICT self-efficacy for students’ ICT use and their achievement in a computer and information literacy test. Computers & Education, 102, 103–116. https://doi.org/10.1016/j.compedu.2016.08.001

  38. Stelter, A., Goldhammer, F., Naumann, J., & Rölke, H. (2015). Die Automatisierung prozeduralen Wissens: Eine Analyse basierend auf Prozessdaten [The automation of procedural knowledge: An analysis based on process data]. In J. Stiller & C. Laschke (Eds.), Berlin-Brandenburger Beitrage zur Bildungsforschung 2015: Herausforderungen, Befunde und Perspektiven Interdisziplinärer Bildungsforschung (pp. 111–132). Frankfurt am Main: Lang. https://doi.org/10.3726/978-3-653-04961-9

  39. Su, S. (2017). Incorporating Response Times in Item Response Theory Models of Reading Comprehension Fluency. University of Minnesota Digital Conservancy. Retrieved from https://search.proquest.com/docview/2013525225?accountid=13049

  40. Tein, J.-Y., Coxe, S., & Cham, H. (2013). Statistical power to detect the correct number of classes in latent profile analysis. Structural Equation Modeling, 20(4), 640–657. https://doi.org/10.1080/10705511.2013.824781

  41. Thomson, S. (2015). Australian Students in a Digital World. Policy Insights, Issue 3. Melbourne: ACER. Retrieved from https://research.acer.edu.au/cgi/viewcontent.cgi?article=1002&context=policyinsights

  42. Wang, S., Zhang, S., Douglas, J. & Culpepper, S. (2018). Using Response Times to Assess Learning Progress: A Joint Model for Responses and Response Times. Measurement: Interdisciplinary Research and Perspectives, 16(1), 45–58. https://doi.org/10.1080/15366367.2018.1435105

  43. Zhong, Z.-J. (2011). From access to usage: The divide of self reported digital skills among adolescents. Computers and Education, 56(3), 736–746. https://doi.org/10.1016/j.compedu.2010.10.016

  44. Zillien, N., & Hargittai, E. (2009). Digital distinction: Status-specific types of internet usage. Social Science Quarterly, 90(2), 274–291. https://doi.org/10.1111/j.1540-6237.2009.00617.x

Acknowledgements

Not applicable.

Funding

There was no funding for our research.

Author information

Contributions

All authors contributed to the concept and design. They were jointly responsible for the elaboration and revision of the article. All authors read and approved the final manuscript.

Corresponding author

Correspondence to Melanie Heldt.

Ethics declarations

Ethics approval and consent to participate

We rely on data from the ICILS 2013 study, which conforms to IEA ethical standards. The Australian Council for Educational Research (ACER) in Melbourne served as the international study center for ICILS, working in close cooperation with the IEA, and the national centers of participating countries.

Consent for publication

We provide our consent to publish this manuscript upon publication in the Springer open journal LSA.

Competing interests

The authors declare that they have no financial or non-financial competing interests.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.

About this article

Cite this article

Heldt, M., Massek, C., Drossel, K. et al. The relationship between differences in students’ computer and information literacy and response times: an analysis of IEA-ICILS data. Large-scale Assess Educ 8, 12 (2020). https://doi.org/10.1186/s40536-020-00090-1

Keywords

  • Response times
  • Student background characteristics
  • Computer and information literacy
  • Process data