
An IERI – International Educational Research Institute Journal

Students’ and teachers’ perceptions of students’ academic outcomes in Slovenia: evidence from REDS data

Abstract

The COVID-19 pandemic brought immense challenges to global society. Social and work life had to be reorganized to accommodate the restrictions imposed to limit the spread of COVID-19. These restrictions affected education worldwide as well: face-to-face education was disrupted and alternatives had to be found. One of the questions raised by the disruption concerns student outcomes during the periods when the usual teaching and learning was suspended by lockdowns and remote classes took place. There are few studies on the topic in Slovenia and, more importantly, the existing ones do not use representative data to investigate the depth of the problem. The aim of this article is to fill this gap through comprehensive and in-depth analyses of the Slovenian student, school and teacher data from the international Responses to Educational Disruption Survey (REDS), conducted in 2020/2021. The data analysis involves descriptive and multivariate statistical methods appropriate for the mostly categorical data available from REDS. The results show that students' perceptions of their learning and academic outcomes during the school disruptions depend on their background characteristics, i.e. the disruptions did not affect all students equally. These results are also supported by findings from the school principals' and teachers' data. In addition, the long-standing issue of "grade inflation" in Slovenia has become even more severe, as shown by both student and teacher data. Students and teachers, however, hold contrasting opinions about student learning: while most students think their learning did not suffer during the disruptions and that they even made more progress in some subjects, teachers are of the opposite opinion. Furthermore, teachers tended to grade students' academic outcomes higher during the disruptions, which further exacerbated the issue of "grade inflation" in Slovenia.

Introduction

There is no doubt that the COVID-19 pandemic affected all aspects of life. Schools, teachers and students (not to forget parents or guardians), as well as the entire organization of teaching and learning, were no exception. Although the adoption of distance learning was key to ensuring the continuity of education (Di Pietro et al., 2020), the change from traditional face-to-face classroom teaching and learning to online instruction had to be achieved rather quickly and without clear guidelines for implementation (Ozamiz-Etxebarria et al., 2021). Global organizations such as UNICEF, UNESCO and many others raised concerns throughout the COVID-19 pandemic about its negative impact on child/student education due to school closures around the world (Gustafsson, 2021; UNESCO, 2021; in Whitley et al., 2021). Physical school closure and the adoption of distance education may negatively affect students' learning through four main channels: less time spent in learning, stress symptoms, a change in the way students interact, and a lack of learning motivation (Di Pietro et al., 2020, p. 10). All of these factors also appear to be interrelated. According to several studies (see Hammerstein et al., 2021), students' academic outcomes decreased due to school closures during the COVID-19 pandemic. Several interrelated arguments can be brought up to explain these learning losses during school closures: (1) there is evidence that quarantined students tended to spend less time learning compared to when schools were open; (2) many students confined at home due to COVID-19 felt stressed and anxious, which could have negatively affected their ability to concentrate on schoolwork; and (3) physical school closure and the lack of in-person contact could have made students less externally motivated to engage in learning activities (Di Pietro et al., 2020, p. 6). Different groups of students and teachers suffered to a different extent from the COVID-19 restrictions in terms of these issues. Each country made its own policy decisions on how school closures and distance learning took place. Thus, the degree of isolation and detachment from schools and friends differed across countries (e.g., Lindblad et al., 2021; Mitescu-Manea et al., 2021).

Empirical studies on the effects of isolation due to COVID-19 in Slovenia are scarce or limited in terms of the representativeness of the investigated population(s) or in other respects, as described below. The studies by Pečjak et al. (2021), Rupnik Vec et al. (2020) and Uršič and Puklek Levpušček (2020), for example, cannot be generalized to the whole population of school students. The Responses to Educational Disruption Survey (REDS), on the other hand, is the first comprehensive international study in which Slovenia participated with nationally representative samples of grade 8 students and teachers. REDS therefore currently appears to be the most reliable source on students' outcomes in compulsory education in Slovenia, although the study did not measure actual academic outcomes, but rather students' own perceptions of them. In Slovenia, REDS data were collected from mid-February to mid-March 2021 for the majority of the school sample and from mid-March to early April 2021 for the remaining schools. The target population was grade 8 students, along with their school principals and the teachers who taught them compulsory subjects during the first-wave school closure in the previous school year.

Background

COVID-19 and academic outcomes

Aremu and Sokan (2003) see learning outcomes (academic achievement and academic performance) as determined by family, school, societal and motivational factors.

In other words, academic outcomes are influenced by several school and out-of-school (e.g., family) factors, as well as individual student factors; their interrelation is also very important.

Given the disruption and the shift away from face-to-face classroom teaching and learning, it can be assumed that students' academic progress could have slowed during the schooling disruptions and school closures imposed by the COVID-19 pandemic. Some studies (see Hammerstein et al., 2021, for a systematic review of 109 studies) have found that students' academic outcomes decreased dramatically during school closures. However, some studies show that the effect was not equal across students, depending on their socioeconomic status (SES). In some cases the studies found that lower-SES students actually benefited from school closures and the use of remote online classrooms, but in most cases it was the other way around (Hammerstein et al., 2021). Patrinos et al. (2022a, 2022b) performed an analysis based on 35 rigorous studies from 20 countries and found that: (1) most studies (32) found evidence of learning loss; (2) learning loss consistently differed by student SES, prior academic learning, and the subject studied; and (3) the longer schools remained closed, the greater the learning losses were.

Some studies also showed gender differences during remote learning. A study in Finnish schools found lower-secondary boys to be slightly more efficient in learning at home during remote learning periods compared to girls (Oinas et al., 2022). This can be explained by male students' better behavioral strategies for coping with disorientation during online learning (Wu & Cheng, 2019). However, females were more self-regulated than males in the preparation phase for online learning (Liu et al., 2021).

Several dimensions were investigated in different studies during COVID-19, among them academic outcomes and, within this dimension, especially learning losses during and after school closures caused by the COVID-19 pandemic. There is, however, evidence of learning losses even before the COVID-19 pandemic. Various studies, dating from as early as 1906 to the present, show that the summer break may disturb the daily rhythm of learning and may even lead to losses in knowledge and skills, sometimes referred to as the summer setback or summer slide (Cooper et al., 1996; Paechter et al., 2015; Quinn & Polikoff, 2017; Cooper, n.d.). The results of those studies show two consistent findings about summer losses: (1) students generally gain academic skills and knowledge at a slower pace over the summer than during the school year; and (2) summer loss, at least in some subject areas, may be especially large for less advantaged students (Phillips & Chin, 2004).

In contrast to school closures due to summer vacations or teacher strikes, during distance learning schools and teachers aim to help students progress in their learning, even though their means are limited (Blaskó et al., 2022). While the literature stresses the importance of different resources for predicting academic outcomes in normal times, the distribution of resources across the student population gains special importance for learning progress during physical school closures, when learning is restricted to a student's home environment (Blaskó et al., 2022). Therefore, home-related factors may still play a great role in investigating learning losses, whether during summer vacations or during COVID-19-related school closures and remote schooling.

The context of Slovenia

Limitations of national examinations for trend analysis

Slovenia was one of the countries with strong restrictions on work and social life, with students spending most of their time in 2020 learning from home. The first COVID-19-related school closure in Slovenia (March 16, 2020) for all education levels was followed by several later closures (differing in duration for different grades in compulsory education). However, it is not known how profound the difficulties schools and teachers experienced in teaching and learning were, nor what the trends in student academic outcomes (increase or decline) have been. The national assessment for the school year 2019/2020, which should have been conducted in May 2020, did not take place; it was resumed in May 2021 for the school year 2020/2021. Compared to the academic outcomes from 2016 to 2019 in Slovene language and Mathematics, no noticeable decline was observed (Cankar and Rakinić, 2022). However, several important limitations should be kept in mind when interpreting those results from a comparative (trend) perspective: (1) the results are not directly comparable with previous years (different items, different cohorts, and only a small number of repeated items, and not for all tested subjects); and (2) the results do not cover all school subjects (the examinations are compulsory in the language of instruction and mathematics for both grades, plus a foreign language in grade 6 or a third subject chosen by the minister in grade 9, with the third subject differing among schools), so knowledge was not examined in all school subjects. Therefore, national examinations cannot give proper insight when comparing student knowledge between the two periods, before and during/after the school closures.

Assessing/grading student knowledge

Another insight into comparing students' knowledge between two points in time could be gained from the perspective of academic outcomes related to specific subjects and their grades. Grading student knowledge has several very important purposes and functions, and these did not simply disappear when students' academic outcomes were assessed during school closures and remote teaching and learning. Based on Strmčnik (2001) and Žveglič Mihelič and Vogrinc (2023, pp. 122–123): (1) the informative function of assessment is to inform students and their parents about the level at which the student is achieving the educational goals; together with so-called practice assessment (literally "verification"), which must precede every formally graded assessment, it also gives teachers real-time insight into their own teaching and allows them to plan changes to improve it; (2) the selective function of assessment lies in the classification of students on the basis of academic outcomes (obtained grades); this gives students a more realistic idea of their own learning situation, their abilities and interests, and often allows them to make better decisions about continuing their schooling; (3) the pedagogical and motivational function of assessment is to motivate the student for further learning and optimal attainment of educational goals; and (4) the repressive function of assessment is to discipline students in the classroom, to punish students for inappropriate behavior or to force students to learn more and better, following the rules and regularly fulfilling their obligations (ibid.). As can be seen, grading students' knowledge has several important functions, all of which contribute to future teaching and learning. One could assume that this feedback, especially considering its motivational function, is a very important factor when thinking about perceptions of students' academic outcomes. As studies have acknowledged, a student's self-concept is what determines the level of academic outcomes, and self-perception in turn can be powerfully influenced by contingencies provided by significant others (Mathew, 2017, p. 5). Among these significant others, the teacher's role, as suggested by the Pygmalion principle, should not be undervalued (ibid.): teacher expectations influence student learning outcomes, positive expectations influence academic outcomes positively and negative expectations influence them negatively, which means that expectations can modify behavior (Rosenthal & Jacobson, 1968).

In Slovenia, teachers assess students based on objectives (and standards) written in the syllabus and the assessment procedure regulations. Teachers continually assess student outcomes in written, oral, and applied forms, as well as through written tests. Rules define how many times per year students must demonstrate their knowledge in both written and oral forms. However, these rules vary across grades and are quite stringent, leaving teachers with a limited scope for decision making and authority in terms of classroom assessment (Doupona, 2012, p. 603; Klemenčič, 2022).

The grading system is a strong institution within Slovenia's education system, with grades being perceived as a strong motivational factor. However, there is a dispute over whether earning high grades has more recently become a student's primary goal, as opposed to gaining knowledge (ibid.). This is also often discussed in the media and in public debate when grades from different generations are compared, with regard to possible grade inflation (Klemenčič, 2022). The challenge goes even further back in history, to Jurman (1989), who pointed out that if a higher proportion of students get higher grades, the quality of knowledge declines, and that there is therefore pressure on teachers to lower the criteria when evaluating knowledge. In essence, he was describing grade inflation. The public discourse in Slovenia about high grades and the constant inflation of very good and excellent grades has a history of more than a decade. It intensified in 2010, when the first study on grades in internal assessments (Zupanc & Bren, 2010) was published and, to the best of the authors' knowledge, documented the phenomenon with data for the first time. The grade inflation discourse, as well as the challenges of assessing and grading students' knowledge during remote schooling, was also prominent during the school disruption caused by COVID-19. It was regularly discussed both in the media (e.g. Škerl Kramberger, 2020; Kuralt, 2020a, 2020b; Kuralt, 2022; Musić, 2022; Katalenic, 2021) and among experts (e.g. Štefanc et al., 2020). One could observe that, even more often than usual, this topic was at the forefront of the media around the end of each school year. The National Institute for Education also prepared and circulated recommendations with some specifics on assessing and grading students' knowledge during school closures, such as avoiding written tests, adjusting the criteria for assessing knowledge, and assessing only those prescribed goals that were achievable (ZRSS, 2020). Nevertheless, assessing and grading students' knowledge remained high on the agenda, especially after the minister responsible for education intervened. "Speaking to TV Slovenia, the minister said that they will try to find a way for the students to get a 'good, friendly and encouraging grade'. The minister did not tell us what she meant by that." (Kuralt, 2020). The damage was done: not only was it insufficiently clear what "friendly and encouraging grades" meant, but the descriptor "good" has a very precise meaning in Slovene when speaking about grades, namely grade 3, which is exactly the midpoint of the grading scale. None of this helped with grade inflation; it can be said that it even had the opposite effect. Therefore, this article is also interested in the perceptions of students, teachers and principals not only regarding knowledge (or learning losses), but also regarding the grading of that knowledge.

The present study

There is a clear research gap regarding the perceptions of students and teachers of student learning outcomes during COVID-19 in Slovenia. To fill this gap, this study carries out comprehensive and in-depth analyses on the topic. The article aims to answer the main research question: "What are the perceptions of 8th grade students, teachers and school principals of students' academic outcomes and the grading of their knowledge during the school disruption caused by the COVID-19 epidemic in Slovenia?" The term "epidemic" is used because it was the word used in the media and public discussions to describe the situation in Slovenia, and it is therefore used throughout the REDS instruments in Slovenia. An important aspect of teaching and learning during the COVID-19 disruption was the assessment and grading of student knowledge. Slovenia added national options on assessment and grading to the REDS student questionnaire because of the long-standing challenges with grading student knowledge and, more specifically, to address, on the one hand, the public discourse on the topic during the COVID-19 school closures and, on the other hand, the expert and policy discussions and recommendations concerning this aspect during the closures.

Our analyses were carried out for different groups of students. This grouping is relevant both in light of the literature review (SES and gender differences are addressed in the introduction, as is school location)Footnote 1 and from the national perspective. When considering students' academic outcomes, the results from large-scale assessments of 8th graders show an association between SES and student achievement and fairly stable gender gaps favoring girls (in civic knowledge and in computer and information literacy), except in Mathematics and Science, where no statistically significant differences were found (Klemenčič, Mirazchiyski & Novak, 2019), as well as a digital divide by school location (associated with SES) for computer and information literacy (Mirazchiyski, 2016).

Methodology

Data

The data used in this article stem from the Responses to Educational Disruption Survey (REDS), a study conducted in 2020/2021 in collaboration between the United Nations Educational, Scientific and Cultural Organization (UNESCO) and the International Association for the Evaluation of Educational Achievement (IEA). Eleven countries participated in the study (Meinck et al., 2022). The main objective of REDS was to investigate how countries approached the challenges their educational systems faced in providing students with school education in the difficult circumstances posed by the COVID-19 pandemic, in order to provide information to policy-makers and educational leaders for evidence-based decision making. This information can be helpful in evaluating the effect of the disruption caused to education during the pandemic for students, teachers and schools, and can be used to develop solutions to mitigate such situations in the future. One of the aims of REDS was also to explore which students were put at greater risk when schools were closed and how successful the measures implemented to mitigate the closures through remote teaching and learning were (Meinck & Fraillon, 2022).

This article uses data from the students, teachers and school principals in Slovenia who participated in REDS 2021. The total number of participants is presented in Table 1. In addition to the internationally collected data, Slovenia added its own variables of national interest, some of which were used in the analyses (see the next subsection).

Table 1 Number of participating students, teachers and school principals in Slovenia

Measures

To meet the objective of this article, data from the following internationally collected variables were used.

Student

The variables from the international student questionnaire used in this study are presented in Table 2. The table presents the variable names, the actual questions asked, the response categories (if any) and the applied recodings (reversing and/or collapsing the categories). Two SES variables were used from the original data: SES_IRT and SES_IRT_C. The former is a continuous scale constructed using Item Response Theory (IRT) and the latter is a categorization of the obtained variable into three categories (low, medium and high). For more information on how these two variables were constructed, please refer to the REDS User Guide (Kjeldsen, 2022).

Table 2 Student variables from the REDS international student questionnaire

Data from national variables collected in Slovenia and related to student perceptions of assessment and grading during the COVID-19 disruption were used as well. These are presented in Table 3 along with their properties.

Table 3 Student variables from the REDS national student questionnaire

In addition to the individual international variables, a "Student's capacity to perform school work during the COVID-19 disruption" scale was created. The scale comprises questions under a common stem ("During the COVID-19 disruption, did the following aspects of your schoolwork change?"). The variables used to compose the scale are presented in Table 4 along with their properties.

Table 4 International student-level variables used for composing the “Student capacity to perform school work during the COVID-19 disruption” scale

The student capacity scale was produced using Exploratory Factor Analysis (EFA) after recoding all variables so that the response categories were ordered in increasing magnitude (i.e. "Decreased", "Did not change", "Increased"). The reliability of the scale (Cronbach's alpha) equals 0.78. Given that no decisions will be made at the individual level, the obtained value is satisfactory. The EFA was performed with Principal Axis Factoring (PAF) and the student final weights were applied. The scores were generated using the regression method. The factor loadings and communalities of the EFA model are presented in Table 5. All factor loadings are above 0.6 and all communalities are above 0.4, which is satisfactory. One factor was extracted; it accounts for 47% of the variance in the data, much more than any other possible factor, as can be seen in Fig. 1. The correlation between the factor and the regression scores is r = 0.89, which is quite strong.
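To make the scale construction transparent, the sketch below reproduces its main steps in base R: a weighted correlation matrix, single-factor principal axis factoring, regression (Thurstone) factor scores and Cronbach's alpha. This is an illustration only, not the authors' code; the data frame, item and weight names are placeholders, and the published scale was produced within RALSA using the REDS student final weights.

```r
# Illustrative sketch (not the authors' code). 'items' holds the recoded Table 4
# variables coded 1 = "Decreased", 2 = "Did not change", 3 = "Increased";
# 'w' holds the student final weights. Both names are placeholders.

cronbach_alpha <- function(x) {
  k <- ncol(x)
  k / (k - 1) * (1 - sum(apply(x, 2, var)) / var(rowSums(x)))
}

paf_one_factor <- function(R, max.iter = 100, tol = 1e-6) {
  h2 <- 1 - 1 / diag(solve(R))                      # starting communalities (SMCs)
  for (i in seq_len(max.iter)) {
    Rr <- R
    diag(Rr) <- h2                                  # reduced correlation matrix
    e <- eigen(Rr, symmetric = TRUE)
    lambda <- e$vectors[, 1] * sqrt(e$values[1])    # loadings on the single factor
    if (max(abs(lambda^2 - h2)) < tol) break
    h2 <- lambda^2
  }
  list(loadings = lambda, communalities = lambda^2,
       prop.variance = sum(lambda^2) / ncol(R))     # share of variance accounted for
}

keep <- complete.cases(items)                       # listwise deletion for simplicity
X    <- as.matrix(items[keep, ])
R    <- cov.wt(X, wt = w[keep], cor = TRUE)$cor     # weighted correlation matrix
fit  <- paf_one_factor(R)

# Regression (Thurstone) factor scores, rescaled to a mean of 50 and an SD of 10
scores      <- as.vector(scale(X) %*% solve(R) %*% fit$loadings)
scale_50_10 <- as.vector(50 + 10 * scale(scores))

cronbach_alpha(X)    # corresponds to the reported alpha of 0.78
fit$prop.variance    # corresponds to the reported 47% of explained variance
```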

Table 5 Factor loadings and communalities for the EFA model of the “Student capacity to perform school work during the COVID-19 disruption” scale
Fig. 1

Factors identified from the EFA model for the “Student capacity to perform school work during the COVID-19 disruption” scale

Teacher

The variables from the international teacher questionnaire used in this study are presented in Table 6.

Table 6 Teacher variables from the REDS international teacher questionnaire

Besides the international variables, national variables were used as well. These are presented in Table 7.

Table 7 Teacher variables from the REDS national teacher questionnaire

Please note that the nationally adapted variable on the subject the teachers teach deviates from the international one, as the disciplines do not match the international options exactly due to the specifics of the educational system. There are many different subject areas that teachers teach. For the purposes of this article, the compulsory subjects were collapsed into three categories and used in the analyses with the following response categories (a recoding sketch is given after the list):

  • Humanities/Social Sciences (languages—Slovene, foreign/other languages, Social Sciences subjects, and Creative Arts);

  • Mathematics, Science (general and/or Physics, Chemistry, etc.), technology;

  • Sports.
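As a minimal illustration of this regrouping (referenced above), the sketch below maps placeholder subject labels onto the three groups; the actual national codes and labels are documented in the Slovenian adaptation of the teacher questionnaire.

```r
# Sketch: collapsing the nationally adapted subject variable into three groups.
# The subject labels below are placeholders, not the survey's actual codes.
humanities <- c("Slovene", "Foreign/other language", "Social Sciences", "Creative Arts")
mst        <- c("Mathematics", "Science", "Physics", "Chemistry", "Technology")

teacher$subject_group <- ifelse(teacher$subject %in% humanities, "Humanities/Social Sciences",
                         ifelse(teacher$subject %in% mst, "Mathematics/Science/Technology",
                                "Sports"))
teacher$subject_group <- factor(teacher$subject_group,
  levels = c("Humanities/Social Sciences", "Mathematics/Science/Technology", "Sports"))
```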

School

School-level variables from the questionnaire for principals used in this study are presented in Table 8.

Table 8 School variables from the REDS international school questionnaire

Besides the international variables, this study also uses a nationally adapted variable (NP1G34), school location, which has different categories from the international option due to the small size of settlements in Slovenia. The categories are as follows:

  • A settlement with less than 3000 inhabitants;

  • A settlement with more than 3000 inhabitants but less than 15,000 inhabitants;

  • A settlement with more than 15,000 inhabitants but less than 100,000 inhabitants;

  • A settlement with more than 100,000 inhabitants.

In addition, a variable (TOTALNSTUD) on the total number of target-grade students in the school was created by summing the numbers of female (IP1G32A) and male (IP1G32B) target-grade students at the school, as sketched below.
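A one-line sketch of this derivation is shown below; 'school' is a placeholder name for the data frame holding the principal questionnaire data.

```r
# Sketch: total number of target-grade students per school as the sum of the
# reported numbers of girls (IP1G32A) and boys (IP1G32B); NA if either count is missing.
school$TOTALNSTUD <- school$IP1G32A + school$IP1G32B
```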

Analysis

REDS uses complex sampling: a two-stage stratified cluster design with the probability of selection proportional to the size (i.e. the number of target-grade students) of the primary sampling units (schools) (Meinck et al., 2022). In Slovenia, intact-class sampling within the sampled schools was used (8th-grade classes for students, and the teachers who taught these students during the school closure). The school, teacher and student samples are representative of the country's populations of schools having students in the target population (grade 8 in Slovenia). Sampling weights were calculated for each respondent as the inverse of the selection probability, with non-response adjustments. Analyzing these data with basic statistical methods that assume simple random sampling would lead to biased estimates; analyzing REDS data requires resampling techniques, and Jackknife Repeated Replication (JRR) is used for computing the standard errors of the estimates (Meinck et al., 2022). All analyses were performed with the R Analyzer for Large-Scale Assessments (RALSA) (Mirazchiyski, 2021), which can work with data from studies with such a design.
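As an illustration of how JRR standard errors are obtained, the sketch below computes a weighted percentage with the full weight and with each replicate weight and sums the squared deviations. The weight variable names (TOTWGTS, RWGT1 to RWGT75) and the replication convention (one estimate per replicate, squared deviations summed without an additional factor) are assumptions based on comparable IEA studies; the exact REDS convention is given in its technical documentation, and the reported analyses were run in RALSA, which handles this internally.

```r
# Illustrative JRR sketch with assumed weight names and replication convention.
# 'stud' is a placeholder data frame; 'item' is a placeholder agreement variable.
weighted_pct <- function(x, w, category) {
  100 * sum(w[x == category], na.rm = TRUE) / sum(w[!is.na(x)], na.rm = TRUE)
}

rep_weights <- paste0("RWGT", 1:75)                  # assumed replicate weight names

full_est <- weighted_pct(stud$item, stud$TOTWGTS, "Agree")
rep_est  <- sapply(rep_weights, function(rw)
  weighted_pct(stud$item, stud[[rw]], "Agree"))

jrr_se <- sqrt(sum((rep_est - full_est)^2))          # JRR standard error of the percentage
```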

The analyses use separate student, teacher and school datasets from Slovenia. In addition, where the analyses need to use combined responses (e.g. student opinion on their learning progress by school location), the datasets were merged, i.e. student data was merged with school, and school data was merged with teacher data. Unfortunately, due to the sampling design of REDS, student datasets can be merged with school datasets and school datasets can be merged with teacher datasets, but teacher datasets and student datasets cannot be merged and analyzed together. This is because the student and teacher samples were drawn separately and the sampled teachers in each school may not be the ones teaching the sampled students (see IEA, 2022 for more details).

The analyses in this paper use the variables described in the previous subsection. Initially, multiple logistic regression models were used to test the relationship between student and teacher perceptions of academic performance and grading (as dependent variables) and different student, school and teacher characteristics, as well as variables on teaching and learning during the disruption periods. However, none of these models produced substantive results. The example below (Table 9) shows the results of a logistic regression in which the statement "I learned about as much as before the COVID-19 disruption" was dichotomized into two categories ("Disagree" and "Agree") and used as the dependent variable, with a number of student and school variables as predictors. As the table shows, the model is uninformative: the only variable with a significant coefficient is the student capacity to perform school work scale, and the coefficient itself is rather low. The explained variance (Hosmer and Lemeshow) is just 5%.
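The sketch below illustrates the form of such a model: the agreement item is dichotomised and regressed on a few predictors using the student weights. All variable names are placeholders and base R's glm() is used only for illustration; the model reported in Table 9 was estimated in RALSA, where the standard errors come from the jackknife replicate weights rather than from the model-based formula.

```r
# Sketch with placeholder variable names; point estimates only. Using sampling
# weights with family = binomial triggers a harmless non-integer-successes warning.
stud$learned_same <- ifelse(stud$learned_as_much %in% c("Agree", "Strongly agree"), 1, 0)

fit <- glm(learned_same ~ ses_irt + gender + capacity_scale,
           data = stud, family = binomial, weights = TOTWGTS)
summary(fit)
```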

Table 9 Results from logistic regression with “I learned about as much as before the COVID-19 disruption” as a dependent variable

This is why the analysis methods used in this study are descriptive statistics, cross-tabulations, Spearman correlations (to test the association between variables, as almost all of the variables in the REDS database are categorical) and linear regression with contrast coding to test the difference between female and male students on the newly constructed "Student capacity to perform school work during the COVID-19 disruption" scale. The cross-tabulations also include a chi-square test of independence between the variables. A known problem with the chi-square statistic when using clustered data is that it is biased. To overcome this issue, first- and second-order Rao-Scott adjustments of the chi-square statistic were applied; these adjustments have been shown to perform well compared to other methods (see Skinner, 2019).
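For readers who wish to replicate the Rao-Scott-adjusted tests outside RALSA, a roughly equivalent setup with the survey package is sketched below: a replicate-weight design is declared and svychisq() is called with the first-order ("Chisq") and second-order ("F", Satterthwaite) corrections, followed by a contrast-coded comparison of girls and boys on the capacity scale. The weight names and the JK1 replication type are assumptions about the REDS data layout, not a description of the authors' code.

```r
# Illustrative sketch using the survey package (assumed weight names and JK1 type).
library(survey)

des <- svrepdesign(weights = ~TOTWGTS, repweights = "RWGT[0-9]+",
                   type = "JK1", combined.weights = TRUE, data = stud)

# Rao-Scott adjusted tests of independence (placeholder variable names)
svychisq(~gender + learned_more_at_home, design = des, statistic = "Chisq")  # first-order
svychisq(~gender + learned_more_at_home, design = des, statistic = "F")      # second-order

# Contrast-coded gender comparison on the capacity scale (girls = -0.5, boys = 0.5),
# so the coefficient equals the boy-girl difference in scale points
des <- update(des, gender_c = ifelse(gender == "Girl", -0.5, 0.5))
summary(svyglm(capacity_scale ~ gender_c, design = des))
```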

Results and discussion

Student analyses

During the COVID-19 disruption, all students in Slovenia had to live and study from home, attending classes online. It might be expected that online teaching would lead to lower learning outcomes. However, some researchers (see Hammerstein et al., 2021, for example) have found that, depending on some of their characteristics, some students may actually benefit from online teaching and learning.

The overall results on students' opinions of their learning outcomes during the COVID-19 disruption period are presented in Table 10, comparing the percentages of students who agree or disagree that they learned more from home than when learning at school. As the table shows, the majority of students (73.23%) strongly disagree or disagree with the statement. However, more than a quarter (26.77%) still believe (agree or strongly agree) that they learned more at home than when attending regular classes at school. The next analyses test the relationship between this variable and student gender, SES and the location of the school they attend.

Table 10 Overall student responses to the “I learned more studying at home than when attending regular lessons at school” in Slovenia

Figure 2 displays the heat plot from a two-way table between student gender (rows) and their agreement or disagreement that they learned more at home than at school (columns). The majority of both female and male students disagree with this statement, although this majority is smaller among male students. A sizeable number of male students still agree that they learned more at home, and this number is much higher than for female students. The relationship between the two variables (student gender and agreement that they learned more at home) is statistically significant (χ2(Rao-Scott 1) = 19.902, p < 0.05; χ2(Rao-Scott 2) = 210.145, p < 0.05).Footnote 2 This is in line with the results from a study in Finnish schools, where lower-secondary boys were found to be slightly more efficient in learning at home during remote learning periods compared to girls (Oinas et al., 2022). This can be explained by male students' better behavioral strategies for coping with disorientation during online learning (Wu & Cheng, 2019) and can be essential for educators when they design different implementations of online learning, taking gender differences into consideration. An alternative interpretation, however, is also possible. In many countries, female students have been shown to have much lower self-concept in mathematics (see Mejía-Rodríguez et al., 2021), for example, even when they outperform male students in the subject. It may also be that male students in Slovenia simply have a higher self-concept regarding their learning than female students and overestimated how much they learned during the disruption. Unfortunately, REDS does not provide any data on student achievement to test this hypothesis.

Fig. 2

Heat plot of frequencies for boys and girls answering the item "I have learnt more at home than when attending regular lessons at school"

Similarly, Fig. 3 shows the heat plot from a two-way table between the categorized student family SES and how much students agree they learned more from home compared to learning at school. The proportion of low-SES students in Slovenia is rather small, less than 7%, so the first two categories ("Low" and "Medium") were collapsed together. This way, the proportions of students with low-to-medium SES and with high SES are approximately equal, 51% and 49% respectively. As Fig. 3 shows, students from both low-to-medium-SES and high-SES families mostly disagree that they learned more at home than in regular classes at school. However, fewer low-to-medium-SES students chose the "Disagree" category; more of them instead chose "Strongly disagree", but more of them also agree that they learned more at home, compared to students from high-SES families. The relationship between the two variables is statistically significant (χ2(Rao-Scott 1) = 12.340, p < 0.05; χ2(Rao-Scott 2) = 211.173, p < 0.05). This has been found by other researchers as well (e.g. Hammerstein et al., 2021). It is nevertheless unexpected, as other authors (Jæger & Blaabæk, 2020) find that higher-SES students receive more academic support from their parents, more physical resources and more motivational support than students from lower-SES families. In addition, these results are unexpected because lower-SES students in Slovenia are more concerned about their future academic success and about lagging behind in learning than higher-SES students (Klemenčič et al., 2022).

Fig. 3

Heat plot of frequencies of students at different SES levels answering the item “I learned more studying at home than when attending regular lessons at school”

The relationship between students' opinions on learning more at home compared to regular lessons at school and the location of the school (i.e. the size of the community) was also tested. The majority of students tend to disagree with this statement, regardless of school location. The largest number of students who disagree is from settlements with between 3,000 and 15,000 inhabitants. However, a large number of students whose schools are located in these areas also tend to agree or strongly agree that they learned more from home than in regular classes. This is also true for students from even smaller communities (less than 3,000 inhabitants). The relationship between these two variables, however, is not statistically significant (χ2(Rao-Scott 1) = 11.151, p = 0.360; χ2(Rao-Scott 2) = 533.892, p = 0.363).

An important question is how students perceived their capacity to perform their school work during COVID-19, given the circumstances. To address this question, a "Student's capacity to perform school work during the COVID-19 disruption" scale was created with Exploratory Factor Analysis (EFA), using a set of variables (see Table 4) from the student questionnaire on the changes in students' schoolwork during this period. The correlation of this scale with student family SES is r = − 0.03. Although this correlation coefficient is negative (higher-SES students tend to evaluate their capacity during the disruption lower), it is very close to zero and not significant. That is, SES is not related to students' capacity to perform schoolwork during the disruption; in this respect, the consequences of the disruption did not affect students differently depending on their SES. A simple linear regression model (not included here) with dummy coding of the categorized SES index (low, medium and high) was tested as well, and it did not find statistically significant differences between the three groups of students by their SES.

Gender differences in this capacity were tested for the full scale (see above) and for the changes in the quality of students' school work. For the full scale, the difference between female and male students, favoring boys, is statistically significant (p < 0.05), but rather small (1.05 points on a scale with a center point of 50 and a standard deviation of 10 points). Students' opinions on the changes in the quality of their school work were then cross-tabulated with their gender. The heat plot from the table is presented in Fig. 4. As the figure shows, most female and male students are of the opinion that the quality of their work did not change. However, more female than male students stated that the quality of their work decreased. Conversely, more male than female students are of the opinion that the quality of their work actually increased during the COVID-19 disruption. The relationship between the two variables (student gender and changes in the quality of their schoolwork) is statistically significant (χ2(Rao-Scott 1) = 8.762, p < 0.05; χ2(Rao-Scott 2) = 147.245, p < 0.05). The "quality of education", however, is a vaguely defined term that is often used, but also misused. Since this was only a single variable in the questionnaire and reflects terminology that is often misused because of its inherently positive connotation, we cannot be sure that students really understood the specifics behind the concept of "quality of school work".

Fig. 4

Heat plot of the frequencies of boys and girls on changes in quality of their schoolwork during COVID-19 disruption

Similarly, the school location variable was tested against students' opinions on the quality of their school work during the COVID-19 disruption. Most students, regardless of the size of the settlement in which their school is located, are of the opinion that the quality of their work did not change during the COVID-19 disruption. It has to be noted that the students who think the quality of their schoolwork increased come mostly from schools located in settlements with fewer than 3,000 inhabitants or with between 3,000 and 15,000 inhabitants. The majority of students from larger settlements are also of the opinion that the quality of their work did not change, and the numbers of those who think it decreased or increased are less than half as large. The relationship between these two variables is, however, not statistically significant (χ2(Rao-Scott 1) = 1.612, p = 0.982; χ2(Rao-Scott 2) = 364.132, p = 0.968).

The outcomes from remote teaching and learning can be compared with those from before the disruption. Table 11 presents the student responses on the perceptions of their learning and progress during the COVID-19 disruption compared to the period before. As the table shows, more than half of the 8th graders in Slovenia (53.38%) agree that they learned about as much during the disruption period as before, with more than 14% strongly agreeing with this statement. As for progress in some subjects, 65% are of the opinion that they made more progress during the COVID-19 disruption, with almost 20% strongly agreeing with the statement. These results, however, contradict the teachers' opinions on student learning and the amount of work delivered by the students (see Table 14 in the teacher analyses below). That is, while most students in Slovenia are of the opinion that they made more progress in some subjects and that their learning was unaffected by the disruption, the teachers are of the opposite opinion. At the same time, teachers admitted that they gave higher grades to their students (Table 15 in the teacher analyses below), which contributed to grade inflation in Slovenia during the COVID-19 closures and added to the general issue of grade inflation that was noticed long before the COVID-19 disruption and investigated by Zupanc and Bren (2010).

Table 11 Frequencies of student answers on their learning and progress during COVID-19 disruption compared to the previous period
Table 12 Percentages of student agreement on questions regarding their perceptions on assessment and grading during the COVID-19 disruption

The relationships of these two variables (learning as much as before, and progressing more in some subjects, during the COVID-19 disruption) with student gender, family SES and school location were tested to identify which students think they benefited more. Figure 5 shows the heat plot from the two-way table between student gender and student opinion on their learning during COVID-19 compared to the previous period. As the figure shows, female students tend to disagree somewhat more than male students that they learned as much during the disruption, while male students tend to agree more. These two variables are significantly related (χ2(Rao-Scott 1) = 33.104, p < 0.05; χ2(Rao-Scott 2) = 211.116, p < 0.05). Student opinion on their progress by gender showed that both female and male students mostly agree that they made more progress during the disruption, but no significant relationship was found between the two variables, although male students show higher levels of agreement. No relationship was found between student opinion on their progress during the COVID-19 disruption and family SES or school location. This is in line with the results of student agreement that they learned more while studying at home than at school (Fig. 8).

Fig. 5

Heat plot of frequencies of boys and girls answering the question “I learned about as much as before the COVID-19 disruption”

Table 12 shows the percentages of students agreeing with different statements on school assessment and grading during the disruption period. The table shows interesting results. More than a quarter (28.17%) of the 8th graders in Slovenia tend to be of the opinion that they did not need to have the same knowledge to obtain the same grades as before the disruption. Also, nearly a third (32.29%) tend to share the opinion that teachers were not able to control the use of unauthorized materials during the assessment of knowledge. Most of the students (58.37%) tend to share the opinion that the help their parents provided with studying did not affect their grades; still, nearly 42% tend to agree or strongly agree that their parents' help influenced their grades at school. More than a quarter of the students also tend to think that teachers did not manage to carry out lessons well enough to assess their knowledge. More than half of the students (nearly 57%) tend to agree or strongly agree that they feel teachers were giving them higher grades compared to the period before the school disruption.

Two of the most important statements related to assessment and grading are (1) the one about the amount of knowledge needed for the grades during the first wave of the epidemic; and (2) the feeling that teachers gave higher grades during the closure compared to the period before it (the first and last statements in Table 12). The heat plots for these two variables from two-way tables with the categorized SES and school location are presented in the figures below. As Fig. 6 shows, students from low-to-medium-SES families tend to agree more that they needed to have the same good knowledge, compared to high-SES students, whose responses are more scattered across the response categories. The relationship between the two variables is statistically significant (χ2(Rao-Scott 1) = 18.867, p < 0.05; χ2(Rao-Scott 2) = 5.439, p < 0.05).

Fig. 6

Heat plot of frequencies of students at different SES levels responding to the question "For my grades during the first closure I had to have the same good knowledge as otherwise"

The heat plot from the two-way table between SES and how much students feel teachers gave them higher grades during the disruption than they would have otherwise is presented in Fig. 7. Students from higher-SES families tend to agree more with the statement, while the responses from the low-to-medium-SES families are more scattered. The relationship between the two variables is statistically significant (χ2(Rao-Scott 1) = 10.543, p < 0.05; χ2(Rao-Scott 2) = 2.953, p < 0.05). It seems that the COVID-19 disruption increased grade inflation: higher-SES students are more of the opinion that teachers gave them higher grades for their knowledge than lower- and medium-SES students are.

Fig. 7

Heat plot of frequencies of students at different SES levels responding to the question "I have a feeling that the teachers gave us higher grades than they would have otherwise if the school had not been closed"

The relationship between school location and the two variables of needing to have the same good knowledge to get the same grades and having higher grades during the COVID-19 disruption did not reveal any significant relationship.

The results in this section have shown that, while not a majority, a large proportion of students (more than a quarter) think that they learned more at home than they would have in regular classes, and more boys than girls think so. Contrary to other studies focusing on learning losses during the COVID-19 disruption, this study found that students from low-to-medium-SES families are, on average, more often of the opinion that they learned more. The results of the analysis of school location and the amount learned are not statistically significant. The capacity of students to perform school work (self-evaluation) did not show a significant relationship with family SES or school location. It did show a significant gender difference favoring boys, but this was rather small. In general, more boys are of the opinion that the quality of work did not change during the pandemic disruption, but there are also more boys than girls of the opinion that it actually increased. The results from the students' perspective on assessing and grading their knowledge are interesting, as more than half of the students felt that teachers gave them higher grades compared to the period before the school disruption. Students from higher-SES families tended to agree more with this statement than others, while gender did not show any statistically significant results. Nearly 42% are of the opinion that their grades were significantly influenced by how much their parents or guardians helped them.

School analyses

The school principals were asked about their opinion on the lasting impact of the COVID-19 disruption on the academic outcomes of students in their schools. The question was asked separately for all students in the school, for lower-achieving students, for higher-achieving students, and for students from low-income families. The percentages of school principals' responses for the different groups of students are presented in Table 13. A total of 75% of principals are of the opinion that the disruption will have decreased the academic outcomes of all students to some degree, and 10% are of the opinion that the outcomes decreased to a substantial degree. Only about 2% of the principals responded that the outcomes will have increased to some or a substantial degree for all students in general. Principals' opinions varied for the other groups. For the low-achieving students, 51% of principals think that academic outcomes decreased substantially and 38% that they decreased to some degree, a total of 89%. While 67% of the principals are of the opinion that academic outcomes did not change for the high-achieving students, 22% are of the opinion that they decreased to some extent and 1% that they decreased substantially. The results for the students with low-income backgrounds are rather pessimistic, as they are for the low-achieving students: a total of 66% of the Slovenian school principals are of the opinion that the academic outcomes of students from low-income families will have decreased to some or a substantial degree. Along these lines, studies from the United Kingdom, Belgium, the Netherlands and Catalonia (Spain) have already shown that learning losses have been widespread and especially intense among disadvantaged students (Gonzalez and Bonal, 2021). Students from less advantaged homes are more likely to experience a greater decline in learning than those from more advantaged families, which could lead to a wider socioeconomic gap in student learning outcomes (Di Pietro et al., 2020). This widening gap could turn from a short-term into a long-term one and even grow over time, with consequences for future educational outcomes and the labor market (Di Pietro et al., 2020).

Table 13 Frequencies of the extent to which school principals believe the COVID-19 disruption will have a lasting impact on students' academic outcomes

Subsequent analyses of the relationships between the variables on the impact of the COVID-19 disruption for these groups of students and the categories of other school variables (the size of the school [i.e. number of students], family SES and school location) did not reveal any strong or significant relationships. That is, there were no factors that could explain principals' opinions on the impact for these groups of students. There are, however, two exceptions. First, the percentage of students from socioeconomically disadvantaged homes at school is related to the impact on the academic outcomes of high-achieving students. The Spearman correlation coefficient between these two variables is r = − 0.27 (p < 0.05), i.e. as the share of students from socioeconomically disadvantaged homes increases, principals' opinions on the academic outcomes of high-achieving students tend to lean towards the expectation of decreasing academic outcomes. Second, the percentage of students from socioeconomically disadvantaged families at school correlates negatively and significantly (r = − 0.21, p < 0.05) with the impact on academic outcomes for students from low-income families.

Teacher analyses

All the school work during the school disruption due to the pandemic was done online. Teachers were the ones working directly with students in class remotely and had a first-hand view of student learning progress during the disruption, but also after the remote learning period ended. This section presents the results of the teachers’ views of student learning growth during the disruption compared to the period before, the amount of schoolwork accomplished by students, as well as the overall progress after the COVID-19 disruption.

Figure 8 presents the results of teachers' perceptions on whether their students showed the same rate of learning growth during the COVID-19 disruption as before. As the figure shows, most of the teachers strongly disagree (19%) or disagree (62%), a total of 81%. Only 18% agree that students progressed at the same rate, and just over 1% strongly agree that the learning growth of their students was the same as during the period before the disruption.

Fig. 8

Frequencies of teacher responses on the statement “My students have shown the same rate of learning growth as before the COVID-19”

Table 14 presents the results of teachers' opinions on student learning and the amount of school work students were able to produce during the disruption compared to the period before. The results in both columns are quite similar: for both statements, a quarter or more of the teachers are of the opinion that these aspects substantially decreased during the disruption. More than half (55% and 54%, respectively) are of the opinion that student learning and the amount of work decreased to some extent. The share of teachers who are of the opinion that the learning and the amount of work of their students increased or substantially increased is below 6%. As stated earlier, these results contradict the students' opinions on their progress and the amount of learning during the COVID-19 disruption (Table 11).

Table 14 Changes in aspects of student learning during the disruption compared to the previous period

The teacher questionnaire also has a question on whether students had progressed to the extent the teacher would normally have expected at that time of the year following the COVID-19 disruption. The results are presented in Fig. 9, showing that the majority of teachers (87%) agree or strongly agree that students had not progressed as expected. However, 13% disagree or strongly disagree, i.e. they are of the opinion that student progress was the same as they would have expected at this time of the school year had there been no school disruption.

Fig. 9

Frequencies of teacher responses on the statement “Students had not progressed to the extent that I would have normally expected at this time of year”

It could be expected that these differences depend on different characteristics of the schools where teachers work, as well as on different teacher characteristics. Teacher gender is unrelated to any of the variables above. School location is also unrelated to any of the variables, except for the statement that students have not progressed as expected at this time of the year. The results are presented in the heat plot from the two-way table between the two variables in Fig. 10. As can be seen, most of the teachers agreeing or strongly agreeing that students had not progressed as expected for this time of the school year are from schools in settlements with 3,000 to less than 15,000 inhabitants, followed by teachers from settlements with less than 3,000 inhabitants. The relationship is statistically significant (χ2(Rao-Scott 1) = 24.644, p < 0.05; χ2(Rao-Scott 2) = 477.918, p < 0.05).

Fig. 10

Heat plot of the frequencies of responses by teachers in different school locations on the statement “Students had not progressed to the extent that I would have normally expected at this time of year”

It can be expected that teachers teaching different subjects would have different opinions on student learning growth, learning in general, the amount of school work during the disruption, as well as on the expected progress following the disruption.

The analyses that follow use the recoded subject the teachers teach and each of the variables on the impact the COVID-19 disruption had on student learning (learning growth, learning in general, the amount of school work during the disruption, as well as the expected progress following the disruption). These results are presented in the heat plots from two-way tables in the following figures.

As Fig. 11 shows, the teachers teaching social sciences/humanities are the ones who most often disagree or strongly disagree with the statement that students have shown the same learning growth during the pandemic. The relationship is statistically significant (χ2(Rao-Scott 1) = 37.066, p < 0.05; χ2(Rao-Scott 2) = 4.237, p < 0.05).

Fig. 11

Heat plot of frequencies of teachers teaching different subjects on the question "My students have shown the same rate of learning growth as before the COVID-19 disruption"

Similarly, teachers teaching social sciences/humanities are the ones who mostly find that their students’ learning has substantially or to some degree decreased (Fig. 12). The relationship between the subjects the teachers taught and their perception on the increase/decrease in student learning is statistically significant (χ2(Rao-Scott 1) = 20.192, p < 0.05; χ2(Rao-Scott 2) = 2.331, p < 0.05).

Fig. 12

Heat plot of the frequencies of teachers teaching different subjects on the question if student learning has changed during the COVID-19 disruption compared to the period before the disruption

The relationship between the group of subjects taught and the next variable (the amount of school work students produced) is very similar: the heat plot (not published here) looks almost identical to the previous one, and the relationship between the two variables is statistically significant (χ2(Rao-Scott 1) = 27.972, p < 0.05; χ2(Rao-Scott 2) = 3.049, p < 0.05). According to Patrinos et al. (2022a, 2022b), studies conducted during the COVID-19 disruptions showed that learning losses in Mathematics exceed those in reading. However, the data used in this article are grouped by subject area and do not provide insights into every single subject.

The relationship between the group of subjects taught and the level of teacher agreement on the students’ progress during the COVID-19 disruption compared to the previous period is not statistically significant (χ2(Rao-Scott 1) = 6.701, p > 0.05; χ2(Rao-Scott 2) = 0.863, p > 0.05).

The last teacher-level analyses are the ones related to assessing and grading student knowledge. There are three national variables Slovenia added to the teacher questionnaire (see the Methods section). The percentages for the separate categories of these variables are presented in Table 15.

Table 15 Frequencies of teacher responses on questions related to assessment and grading during the school disruption

While nearly 42% of students tend to agree or strongly agree that their grades were significantly affected by the amount of help from their parents (Table 12), a total of 81% of teachers tend to share this opinion about their students’ grades.

It can be expected that there were changes in the way and the criteria by which teachers graded their students. One of the national questions asked target-grade teachers in Slovenia whether they gave higher grades in order to avoid altercations and inspections from the authorities. It is rather surprising that nearly half (47%) of the Slovenian teachers agree or strongly agree with this statement. At the same time, teachers received a clear message from the minister of education that grades shall be “good, friendly and encouraging” (Kuralt, 2020), and the National Institute of Education circulated recommendations to adjust the criteria for assessing student knowledge (ZRSS, 2020). Nevertheless, 60% of the teachers state that they adhered to the same grading criteria as before the closure. Two questions then arise: (1) How do these recommendations help to solve the problem of grade inflation that grows year by year; and (2) Do the teachers who disagree more that they gave higher grades to avoid altercations and inspections also tend to agree more that they adhered to the same criteria, and vice versa? The heat plot in Fig. 13 displays the relationship between these two variables (see Note 3). As can be seen, the majority of the responses are clustered around the middle of the plot and come from teachers who tended to disagree that they graded students higher while at the same time somewhat agreeing that they kept the same grading criteria. Note the diagonal pattern: teachers who disagreed more that they gave higher grades also tended to agree more that they used the same grading criteria.

Fig. 13
figure 13

Heat plot of the frequencies of teacher responses on statements related to assessment and grading student knowledge during the disruption (“When assessing knowledge, I leaned towards higher grades because I wanted to avoid conflict situations and possible inspection procedures” and “I assessed the demonstrated knowledge of students according to the same criteria as before the closure of schools”)

Although the pattern is not very clear (i.e. there are teachers who tend to agree on both variables), the relationship is statistically significant (χ2(Rao-Scott 1) = 162.624, p < 0.05; χ2(Rao-Scott 2) = 12.313, p < 0.05). This is also supported by the Spearman correlation between the two variables (r = − 0.21, p < 0.05), although the coefficient is very low. The association was also tested within groups defined by subject taught and by school location, and the relationship was significant in all groups defined by the categories of these variables.
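The reported coefficient was estimated within the survey framework; purely as an illustration (not the procedure actually used here, and with hypothetical variable names), a Spearman-type correlation between the two Likert items could be computed in R as follows, first unweighted and then as a weighted rank correlation:

```r
# Hypothetical placeholders: HIGHER_GRADES and SAME_CRITERIA are the two Likert
# items (stored as numeric codes), TOTWGTT the teacher weight.
teachers <- read.csv("teachers.csv")
d <- na.omit(teachers[, c("HIGHER_GRADES", "SAME_CRITERIA", "TOTWGTT")])

# Unweighted Spearman correlation as a quick check
cor.test(d$HIGHER_GRADES, d$SAME_CRITERIA, method = "spearman")

# A rough weighted approximation: Pearson correlation of the ranks using the
# teacher weights (this ignores the clustered design, so no p-value is reported)
rx <- rank(d$HIGHER_GRADES)
ry <- rank(d$SAME_CRITERIA)
cov.wt(cbind(rx, ry), wt = d$TOTWGTT, cor = TRUE)$cor[1, 2]
```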

To sum up, the teachers’ views on students’ learning growth during the COVID-19 disruption are rather pessimistic, as more than 80% of the teachers strongly disagree or disagree that their students showed the same rate of learning as before the pandemic. Nearly 90% of Slovenian teachers also do not see the student progress they would expect at this time of the school year, and teachers from smaller settlements are more inclined to hold this opinion than teachers from larger cities. Teachers teaching Humanities/Social Sciences are more inclined than teachers of Mathematics, Sciences (general and/or Physics, Chemistry, etc.) and Technology to find that the students they teach show lower learning growth, that their learning has substantially decreased, and that the amount of school work they produced is lower. Student and teacher opinions differ on the assessment of student knowledge and grading: teachers agree much more often that they gave students higher grades. There is, however, some discrepancy within teacher opinion about the grades, as a large number of teachers stated that they did give higher grades to avoid altercations but at the same time stated that they followed the same grading criteria. A total of 81% of teachers reported that student grades were significantly influenced by how much and what kind of help their parents/guardians offered them. This also shows how important family support was for learning outcomes during remote schooling.

Conclusions

The main research question of this article was “What are the perceptions of 8th grade students, teachers and school principals on students’ learning outcomes and grading their knowledge during the school disruption caused by the COVID-19 epidemic in Slovenia?”.

In conclusion, students are more inclined to see their learning during the COVID-19 disruption positively, while school principals and teachers tend to see student learning, progress and growth more negatively. Only a very small proportion of teachers and school principals see student academic outcomes during remote learning positively. In schools with a higher share of students from economically disadvantaged homes, students from both high- and low-SES families are expected to have decreased academic outcomes. Family SES, student gender and school location have in most cases affected students’ perceptions of their learning outcomes during the COVID-19 school disruptions. The circumstances of the COVID-19 disruption also affected the way teachers assess and grade student knowledge. Both students and teachers recognize that parental help with studying affected student grades, and teachers admitted that grades were higher than they would have been for the same knowledge had learning taken place in regular classes. This raises special concern given that Slovenia has long faced evident grade inflation.

The importance and relevance of the REDS study, and especially of secondary analyses of its database including both international and national variables, is twofold: international importance for the field of education during specific circumstances and national importance for future policy-making. From an international point of view, comparative studies can, as usual, provide a very important perspective on a broadly relevant topic, and REDS is no exception. The largest disruption and school closures showed that some characteristics of particular school systems can be compared across countries and that learning from good practices is necessary whilst taking different contexts into account. They also showed that it is sometimes worth learning from countries that are not usually considered suitable comparisons. In the case of remote schooling, out-of-school characteristics (e.g. the home environment) appear to be very important. Our analyses indicate that the Slovenian results can be internationally relevant for countries with similar school and out-of-school compositions that did not take part in the REDS study. These results can be compared to similar studies in other countries, with special attention paid to the groupings used for the analyses; for example, collapsing two SES categories together gives better estimates for Slovenia because of the very low proportion of students in the lowest SES category, which means that investigating inequalities in education should also reflect national circumstances. Another important aspect is that, when investigating academic outcomes, for which we have results from large-scale assessments globally, the perception of academic outcomes is equally important. Indeed, comparing those perceptions across different groups (e.g. by student gender, SES, school location, etc.) can help in taking adequate measures to narrow the gaps between different sets of inequalities. The pandemic, hopefully, teaches us that the importance of education is universal, and that some elements of different educational systems, and how countries address them, can be substantially relevant.
From a national point of view, the relevance of the REDS study and of this secondary analysis lies in providing a good data-driven basis for future policy-making in education, with the focus on students’ academic outcomes (not so much on grading students’ knowledge as on acquiring knowledge and relevant skills, or even ensuring the well-being of everyone involved in education), which is at the forefront of this article. In this respect, it needs to be noted that our results show that grading students’ knowledge during the epidemic created even more challenges during remote teaching and learning, challenges that will not simply disappear in the future, especially because grading has already been a challenge for the last two decades. The last comprehensive Slovenian school reform (as opposed to small revisions of syllabuses) took place in 1995. Various international large-scale student assessments in the past, as well as the REDS study, clearly show the need for a more comprehensive school reform that will address the needs of current and future learners. It was recently announced that directions for a school reform would be prepared before the end of 2023, and it is clear from the political discussions that grading learning outcomes is one of the issues that needs to be addressed in this reform. The school disruption also brings to the surface other relevant aspects associated with student academic outcomes, among which inequalities related to students’ background characteristics are very important. Different aspects of our paper could therefore be of high importance, because REDS was, to our knowledge, the first and at the moment the only study in Slovenia that is representative of the target populations while investigating different aspects of school closure and remote teaching during the COVID-19 pandemic, as well as views on schools (learning, teaching, well-being, etc.) once they re-opened. These aspects are currently under discussion for a future comprehensive education reform, and REDS, together with the results of other large-scale student assessments, can support a data-driven reform, especially as we show in this paper that national examinations are very limited in providing answers about trends in students’ outcomes.

In addition, and this is relevant from both the national and the international point of view, the results show how important it is to combine results from several international assessments and studies, and that attitudinal questions (in assessments or comparative studies) are also of great importance. For example, the results of this paper show that both students and teachers agreed (although to a different extent) that help from parents/guardians was important for students’ grades. As we know from previous studies of 4th graders in PIRLS (reading literacy) and TIMSS (Mathematics and Science), a large share of parents/guardians (around one third) reported that they check every day whether their child did their homework correctly (Klemenčič, 2019). This also means that during school disruptions, as well as during learning in regular classes, not only is SES an important predictor of learning outcomes, but so is the daily involvement of parents in their children’s learning. This information is also important for school policy-makers when designing reforms, for instance for regulations regarding homework (and the assessment of those tasks if they are graded).
The overall advice for national policy-making would also be to carefully consider whether, when, and for how long schools should be closed should a future pandemic or similar extraordinary event occur, because the results of this paper clearly show that home schooling does not contribute to learning outcomes to the same extent as regular classes. On the contrary, it deepens various inequalities.

This article does not come without limitations. The main limitation is that all measures are self-reported. That is, students, teachers and school principals only report their perceptions of academic performance, and there are no actual data on learning outcomes in different subjects, which would be a more reliable and valid source on the actual outcomes. This is why continued participation in large-scale student assessments, which investigate learning outcomes in different literacies together with the contextual factors associated with them, is all the more valuable.

Availability of data and materials

The data used for the analyses in this article are publicly available and can be found on IEA’s data repository (https://www.iea.nl/data-tools/repository).

Notes

  1. The international REDS report also considered three categories of inequality: gender gaps, socioeconomic gaps, and the rural–urban divide (Strietholt & Süttmann, 2022). However, our analysis of the rural vs. urban divide was focused on Slovenian circumstances (described above).

  2. Due to the clustered design of the data, the regular χ2 statistic would be biased. Thus, the estimation uses the first (χ2(Rao-Scott 1)) and second (χ2(Rao-Scott 2)) order Rao-Scott adjustments. See Skinner (2019) for an overview of different methods for unbiased estimation of the χ2 statistic.

  3. Note that the categories of the two variables are in a different order to allow for testing the assumption that those who adhered more to the same criteria would disagree more that they were more generous in their grades in order to avoid altercations.

References

  • Aremu, A. O., & Sokan, B. O. (2003). A multi-causal evaluation of academic performance of Nigerian learners: Issues and implications for national development. University of Ibadan.

  • Blaskó, Z., da Costa, P., & Schnepf, S. V. (2022). Learning losses and educational inequalities in Europe: Mapping the potential consequences of the COVID-19 crisis. Journal of European Social Policy, 32(4), 361–375. https://doi.org/10.1177/09589287221091687

  • Cankar, G., & Rakinić, K. (2022). Affects of interplay of distant learning and students background characteristics in time of on students national assessment results. Yerevan: ECER.

  • Cooper, H., Nye, B., Charlton, K., Lindsay, J., & Greathouse, S. (1996). The effects of summer vacation on achievement test scores: A narrative and meta-analytic review. Review of Educational Research, 66(3), 227–268. https://doi.org/10.2307/1170523

  • Cooper, H. (n.d.). Summer Learning Loss: The Problem and Some Solutions: An overview of the concern about summer learning loss. Retrieved February 20, 2023 from https://www.ldonline.org/ld-topics/teachinginstruction/summer-learning-loss-problem-and-some-solutions

  • Di Pietro, G., Biagi, F., Costa, P., Karpiński, Z., & Mazza, J. (2020). The likely impact of COVID-19 on education: Reflections based on the existing literature and international datasets. Publications Office of the European Union, Luxembourg. https://doi.org/10.2760/126686, JRC121071

  • Doupona, M. (2012). Slovenia. In I. V. S. Mullis, M. O. Martin, C. A. Minnich, K. T. Drucker, & M. A. Ragan (Eds.), PIRLS 2011 encyclopedia education policy and curriculum in reading TIMSS & PIRLS international study center. Boston: Lynch School of Education Boston College.

  • Gonzalez, S., & Bonal, X. (2021). COVID-19 school closures and cumulative disadvantage: Assessing the learning gap in formal, informal and non-formal education. European Journal of Education, 56, 607–622. https://doi.org/10.1111/ejed.12476

  • Gustafsson M. (2021). Pandemic-related disruptions to schooling and impacts on learning proficiency indicators: A focus on the early grades. UNESCO Institute for Statistics. uis.unesco.org/sites/default/files/documents/covid-19_interruptions_to_learning_-_final.pdf

  • Hammerstein, S., König, C., Dreisörner, T., & Frey, A. (2021). Effects of COVID-19-related school closures on student achievement-A systematic review. Frontiers in Psychology, 12, 1–8. https://doi.org/10.3389/fpsyg.2021.746289

  • IEA. (2022). Responses to educational disruption survey: User guide for the international database. UNESCO & IEA.

  • Jæger, M. M., & Blaabæk, E. H. (2020). Inequality in learning opportunities during Covid-19: Evidence from library takeout. Research in Social Stratification and Mobility, 68, 1–5.

  • Jurman, B. (1989). Ocenjevanje znanja: selekcija ali orientacija učencev [Assessment of knowledge: selection or orientation of students]. Ljubljana: Državna založba Slovenije.

  • Katalenic, A. (2021). Sistem ocenjevanja se je porušil! Se vračajo eksterci? [The assessment system has collapsed! Are external exams coming back?]. Novice Svet 24, August 5.

  • Kjeldsen, C. C. (2022). Students’ Home Resources and Socioeconomic Background Scale (SHRSBS). In: IEA (Eds), Responses to Educational Disruption Survey: User Guide for the International Database (pp. 27–30). IEA.

  • Klemenčič, E. (2019). Odgovor Pedagoškega inštituta na Peticijo za spremembo šolskega sistema – Kaj kažejo mednarodne primerjalne raziskave? [The Educational Research Institute response to the Petition for Change - What does international comparative research show?]. Retrieved February 20, 2023, from. https://www.pei.si/odgovor-pedagoskega-instituta-na-peticijo-za-spremembo-solskega-sistema-kaj-kazejomednarodne-primerjalne-raziskave/

  • Klemenčič Mirazchiyski, E. (2022). Slovenia. In: Reynolds, K.A., Wry, E., Mullis, I.V.S., von Davier, M. (Eds.) PIRLS 2021 Encyclopedia: Education Policy and Curriculum in Reading. Retrieved from Boston College, TIMSS PIRLS International Study Center website: https://pirls2021.org/encyclopedia

  • Klemenčič Mirazchiyski, E., Mirazchiyski, P., Novak, J. (2019). Državljanska vzgoja v Sloveniji : nacionalno poročilo Mednarodne raziskave državljanske vzgoje in izobraževanja (IEA ICCS 2016) [Civic and Citizenship Education in Slovenia: national report IEA ICCS 2016]. Ljubljana: Pedagoški inštitut. https://doi.org/10.32320/978-961-270-301-1

  • Klemenčič Mirazchiyski, E., Pertoci, N., Mirazchiyski, P. (2021). Mednarodna raziskava motenj izobraževanja v času epidemije covida-19 : nacionalno poročilo - prvi rezultati [International study on educational disruption during the epidemic of COVID-19 : national report - first results]. Ljubljana: Pedagoški inštitut.

  • Kuralt, Š. (2020a). Predlagajo, da v tem času učitelji praviloma ne ocenjujejo [They suggest that, as a rule, teachers do not grade during this time]. Delo, November 20.

  • Kuralt, Š. (2020b, April 18). Ocenjevanje na daljavo ne more biti pravično [Remote assessment cannot be fair]. Dnevnik. https://www.delo.si/novice/slovenija/ocenjevanje-na-daljavo-ne-more-biti-pravicno

  • Kuralt, Š. (2022). Štejejo samo ocene, znanje pa je skrito očem [Only grades count, and knowledge is hidden from view]. Delo, February 7.

  • Lindblad, S., Wärvik, G.-B., Berndtsson, I., Jodal, E.-B., Lindqvist, A., Messina Dahlberg, G., Papadopoulos, D., Runesdotter, C., Samuelsson, K., Udd, J., & Wyszynska Johansson, M. (2021). School lockdown? Comparative analyses of responses to the COVID-19 pandemic in European countries. European Educational Research Journal, 20(5), 564–583. https://doi.org/10.1177/14749041211041237

  • Liu, X., He, W., & Hong, J. C. (2021). Gender Differences in self-regulated online learning during the COVID-19 lockdown. Frontiers in Psychology, 12, 1–8. https://doi.org/10.3389/fpsyg.2021.752131

  • Mathew, S. J. (2017). Self-perception and academic achievement. Indian Journal of Science and Technology. https://doi.org/10.17485/ijst/2017/v10i14/107586

  • Meinck, S., & Fraillon, J. (2022). Introduction to the Responses to Educational Disruption Survey. In S. Meinck, J. Fraillon, & R. Strietholt (Eds.), The impact of the COVID-19 pandemic on education: International evidence from the responses to educational disruption survey (REDS) (pp. 1–5). UNESCO/IEA.

  • Meinck, S., Fraillon, J., & Strietholt, R. (2022). The impact of the COVID-19 pandemic on education: International evidence from the Responses to Educational Disruption Survey (REDS). UNESCO & IEA.

  • Mejía-Rodríguez, A. M., Luyten, H., & Meelissen, M. R. M. (2021). Gender differences in mathematics self-concept across the world: an exploration of student and parent data of TIMSS 2015. International Journal of Science and Mathematics Education, 19(6), 1229–1250. https://doi.org/10.1007/s10763-020-10100-x

  • Mitescu-Manea, M., Safta-Zecheria, L., Neumann, E., Bodrug-Lungu, V., Milenkova, V., & Lendzhova, V. (2021). Inequities in first education policy responses to the COVID-19 crisis: A comparative analysis in four Central and East European countries. European Educational Research Journal, 20(5), 543–563. https://doi.org/10.1177/14749041211030077

  • Mirazchiyski, P. (2016). The digital divide: the role of socioeconomic status across countries. Šolsko polje : revija za teorijo in raziskave vzgoje in izobraževanja, 27(3/4), 23–52

  • Mirazchiyski, P. V. (2021). RALSA: the R analyzer for large-scale assessments. Large-Scale Assessments in Education, 9(21), 1–24. https://doi.org/10.1186/s40536-021-00114-4

  • Musić, I. (2022). Zakaj imajo otroci boljše ocene, kot so jih imeli njihovi starši? [Why do children have better grades than their parents had?]. Siol1, August 1.

  • Oinas, S., Hotulainen, R., Koivuhovi, S., Brunila, K., & Vainikainen, M. P. (2022). Remote learning experiences of girls, boys and non-binary students. Computers & Education, 183, 1–12. https://doi.org/10.1016/j.compedu.2022.104499

  • Ozamiz-Etxebarria, N., Berasategi Santxo, N., Idoiaga Mondragon, N., & Dosil Santamaría, M. (2021). The psychological state of teachers during the COVID-19 crisis: the challenge of returning to face-to-face teaching. Frontiers in Psychology, 11, 1–10. https://doi.org/10.3389/fpsyg.2020.620718

  • Paechter, M., Luttenberger, S., Macher, D., Berding, F., Papousek, I., Weiss, M. E., & Fink, A. (2015). The effects of nine-week summer vacation: losses in mathematics and gains in reading. Eurasia Journal of Mathematics, Science and Technology Education., 11(6), 1339–1413. https://doi.org/10.12973/eurasia.2015.1397a

  • Patrinos, H., Vegas, E., & Carter-Rau, R. (2022a). An Analysis of COVID-19 Student Learning Loss. Policy Research Working Paper 10033, World Bank. https://documents1.worldbank.org/curated/en/099720405042223104/pdf/IDU00f3f0ca808cde0497e0b88c01fa07f15bef0.pdf

  • Patrinos, H., Vegas, E., & Carter-Rau, R. (2022b, May 16). COVID-19 school closures fueled big learning losses, especially for the disadvantaged. https://blogs.worldbank.org/developmenttalk/covid-19-school-closures-fueled-big-learning-losses-especially-disadvantaged

  • Pečjak, S., Pirc, T., Podlesek, A., & Peklaj, C. (2021). Some predictors of perceived support and proximity in students during COVID-19 distance learning. International electronic journal of elementary education, 14(1), 51–62. https://doi.org/10.26822/iejee.2021.228

  • Phillips, M., & Chin, T. (2004). How Families Children, and Teachers Contribute to Summer Learning and Loss. In: GD, Borman, M Boulay (Eds). Summer Learning Research Policies and Programs. Lawrence Erlbaum Associates Publishers. Mahwah

  • Quinn, M. D., & Polikoff, M. (2017, September 14). Summer learning loss: What is it, and what can we do about it? https://www.brookings.edu/research/summer-learning-loss-what-is-it-and-what-can-we-do-about-it/

  • Rosenthal, R., & Jacobsen, L. (1968). Pygmalion in the classroom: Teacher expectation and pupils’ intellectual development. Holt Rinehart and Winston.

  • Rupnik Vec, T., Preskar, S., Slivar, B., Zupanc Grom, R., Deutsch, T., Ivanuš Grmek, M., Mithans, M., Kregar, S., Holcar Brunauer, A., Preskar, S., Bevc, V., Logaj, V., & Musek Lešnik, K. (2020). Analiza izobraževanja na daljavo v času epidemije Covid-19 v Sloveniji (delno poročilo) [Analysis of distance education during the Covid-19 epidemic in Slovenia (partial report)]. Zavod RS za šolstvo.

  • Skinner, C. (2019). Analysis of categorical data for complex surveys. International Statistical Review, 87(S1), S64–S78. https://doi.org/10.1111/insr.12285

  • Škerl Kramberger, U. (2020). Ocenjevanje bo letos milejše, a vseeno bo stresno [Grading will be milder this year, but it will still be stressful]. Dnevnik, May 12.

  • Strmčnik, F. (2001). Didaktika: osrednje teoretične teme [Didactics: central theoretical topics]. Ljubljana: Znanstveni inštitut Filozofske fakultete.

  • Štefanc, D., Makovec Radovan, D., Kalin, J., Mažgon, J., Skubic Ermenc, J., & Šteh, B. (2020). Kaj je treba zagotoviti, da bo ocenjevanje znanja v času izobraževanja na daljavo strokovno legitimno? (odprto pismo) [What needs to be ensured in order for knowledge assessment during distance education to be professionally legitimate? (open letter)]. Sodobna Pedagogika, 71(137), 152–158.

  • Strietholt, R., & Süttmann, F. (2022). Inequalities in teaching and learning during the pandemic. In S. Meinck, J. Fraillon, & R. Strietholt (Eds.), The impact of the COVID-19 pandemic on education: International evidence from the responses to educational disruption survey (REDS) (pp. 184–201). UNESCO/IEA.

  • UNESCO (2021). Framework for re-opening schools supplement: From re-opening to recovery – key resources. unicef.org/media/94871/file/Framework%20for%20Reopening%20Schools%20Supplement-From%20Reopening%20to%20Recovery-Key%20Resources.pdf

  • Uršič, L., & Puklek Levpušček, M. (2020). Učenci zadnje triade OŠ in dijaki o učenju na daljavo med epidemijo COVID-19 [Students of the last triad of elementary school and students on distance learning during the COVID-19 epidemic]. In Ž. Lep & K. Hacin Beyazoglu (Eds.), Psihologija pandemije: posamezniki in družba v času koronske krize (pp. 67–79). Znanstvena založba Filozofske fakultete UL.

  • Whitley, J., Beauchamp, H. M., & Brown, C. (2021). The impact of COVID-19 on the learning and achievement of vulnerable Canadian children and youth. FACETS. https://doi.org/10.1139/facets-2021-0096

  • Wu, J. U., & Cheng, T. (2019). Who is better adapted in learning online within the personal learning environment? Relating gender differences in cognitive attention networks to digital distraction. Computers & Education, 128, 1–12. https://doi.org/10.1016/j.compedu.2018.08.016

  • ZRSŠ (2020b). Izobraževanje na daljavo v posebnih razmerah: Priporočila za ocenjevanje znanja v osnovni šoli [Distance education in special situations: Recommendations for assessment of knowledge in primary school]. https://www.gov.si/assets/ministrstva/MIZS/Dokumenti/Novice/Koronavirus-13-3-20/Priporocila_ocenjevanje-OS_16042020b.pdf

  • Zupanc, D., & Bren, M. (2010). Inflacija pri internem ocenjevanju v Sloveniji [Inflation in internal assessment in Slovenia]. Sodobna pedagogika, 127(3), 208–228.

  • Žveglič Mihelič, M., & Vogrinc, J. (2023, forthcoming). Nameni učiteljevega ocenjevanja znanja: primerjava informativne in motivacijske vrednosti opisnih in številčnih ocen [Purposes of teacher assessment of knowledge: comparing the informative and motivational value of descriptive and numerical assessments]. In: J. Kalin, D. Štefanc (Eds.), Sodobna šola in pouk v luči didaktične zapuščine Franceta Strmčnika (pp. 119–138). Založba UL, Filozofska fakulteta: Ljubljana.

Acknowledgements

The authors would like to express their gratitude to all Slovenian students, teachers and school leaders who took part in the IEA’s REDS study and made it possible to gather valuable data on the effects of the school disruption on education in the country, as well as to the Slovenian Research Agency for funding this research. This article presents the findings on student learning, academic performance and student grading during the school disruptions imposed in Slovenia during the COVID-19 pandemic, using data from the IEA’s REDS study.

Funding

The authors acknowledge the financial support from the Slovenian Research Agency (research core funding No. P5-0106 [Educational Research], and project J5-4570 “Effects of COVID-19 Pandemic on Schooling, Teachers and Students: Well-Being, Teaching and Learning”).

Author information

Contributions

EKM prepared the conception of the paper, introduction, background of the study, finalized the discussion and prepared the conclusions. PVM prepared the datasets, analyzed the data, described and interpreted the results, prepared the discussion and finalized the conclusions.

Corresponding author

Correspondence to Eva Klemenčič Mirazchiyski.

Ethics declarations

Ethics approval and consent to participate

In this manuscript, the officially published REDS datasets were used for the analyses. These datasets were downloaded as public-use files from the IEA’s website (https://www.iea.nl/data-tools/repository). Therefore, neither consent to participate, consent for publication, nor ethics approval was required for these analyses.

Consent for publication

The authors provide their consent to publish this manuscript upon publication in the Springer open journal Large-scale Assessments in Education.

Competing interests

The authors declare that they have no competing interests.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.

About this article

Cite this article

Mirazchiyski, P.V., Klemenčič Mirazchiyski, E. Students’ and teachers’ perceptions of students’ academic outcomes in Slovenia: evidence from REDS data. Large-scale Assess Educ 11, 23 (2023). https://doi.org/10.1186/s40536-023-00173-9

  • DOI: https://doi.org/10.1186/s40536-023-00173-9

Keywords