PERSPECTIVE article

Front. Educ., 15 May 2025

Sec. Assessment, Testing and Applied Measurement

Volume 10 - 2025 | https://doi.org/10.3389/feduc.2025.1565557

This article is part of the Research Topic “How do we collect all this data? A performative account of International Large-Scale Assessment data collection in times of systemic diversity”.

Promoting participation and engagement: Ireland’s experiences across three decades of international large-scale assessments

Emer Delaney*, Aidan Clerkin*, Anastasios Karakolidis, Lorraine Gilleece and David Millar
  • Educational Research Centre (ERC), Dublin, Ireland

In large-scale assessments, the collection of high-quality representative data depends in part on (i) securing high participation rates and (ii) ensuring that participants demonstrate a sufficient level of engagement. This article explores the challenges of promoting participation and engagement with reference to Ireland’s experiences throughout multiple cycles of international and national large-scale assessments. Some factors likely to have influenced participation and/or engagement in the Irish context include: (i) the rising profile of large-scale assessments due to their prominence within a national strategy to improve literacy and numeracy; (ii) the transition of studies from paper-based to digital administration; (iii) the publication of new data protection legislation (both European and national); and (iv) pressures resulting from the COVID-19 pandemic. Initiatives implemented by Ireland’s national study centre with the aim of promoting participation in and engagement with large-scale assessments can be broadly classified as relating to consultation, promotion, and support. Lessons from Ireland’s experiences are discussed in relation to future avenues of investigation that may increase our understanding of facilitators and barriers to engagement. Ireland’s experiences offer valuable lessons that could inform practices in other countries.

Introduction

The influence of international large-scale assessments (ILSAs) on education policy across the world has been widely noted, and Ireland is no exception in this regard (Clerkin and Delaney, 2025). Internationally, the OECD’s broader approach has been described as “soft governance through putative hard fact” (Niemann and Martens, 2018, p. 267), and its significant influence on education in Ireland has been recognized (McNamara et al., 2022). In addition to the OECD, organizations such as the World Bank and the United Nations Educational, Scientific and Cultural Organization (UNESCO) play an instrumental role in global education debates (Elfert and Ydesen, 2023; Steiner-Khamsi et al., 2024). Arguably, “league tables” of cross-national student performance can inspire ill-informed policy borrowing (Klemenčič and Mirazchiyski, 2018), and methodological criticisms have also been outlined (e.g., Eivers, 2010). It is therefore important that data are used responsibly and interpreted in a manner mindful of the national context as well as relevant caveats and limitations. Among other considerations, this requires that these studies succeed in achieving engaged samples that reflect their intended target populations so that appropriate policy conclusions can be drawn.

In this paper, we explore challenges associated with participation and engagement in ILSAs with reference to experiences in Ireland, where the authors’ institution administers several large-scale school-based assessments on behalf of the Department of Education (DoE).1 Ireland has taken part in multiple cycles of TIMSS2 (in 1995, and again since 2011), PISA3 (since 2000), and PIRLS4 (since 2011), and in one cycle each of TALIS5 (2008) and ICCS6 (2009). Large-scale National Assessments of Mathematics and English Reading (NAMER) have also been administered regularly since the 1970s.7

The collection of high-quality representative data in large-scale assessments depends partly on securing high participation rates—first, among sampled schools and, thereafter, among sampled students. Ensuring high participation rates among parents/guardians, teachers, and school principals—who are typically asked to complete questionnaires—is also crucial so that students’ achievement data can be contextualized appropriately. To ensure data quality, the organizations responsible for coordinating ILSAs stipulate that countries must meet certain technical standards, including minimum response rates based on school, class (where relevant), and student participation (Almaskut et al., 2023; LaRoche et al., 2020; OECD, 2024b).8
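
The mechanics of such thresholds are simple in principle. The sketch below is a minimal illustration using hypothetical figures, not the actual weighted procedures set out in the technical reports (Almaskut et al., 2023; LaRoche et al., 2020; OECD, 2024b): a response rate at each level is the proportion of eligible units that participated, compared against a study’s minimum standard.

```python
# Minimal sketch with hypothetical figures; real ILSA standards use weighted
# rates and distinguish rates before/after inclusion of replacement schools.

def response_rate(participating: int, eligible: int) -> float:
    """Participation as a proportion of eligible units (schools or students)."""
    if eligible == 0:
        raise ValueError("No eligible units in the sample.")
    return participating / eligible

# Hypothetical sample: 150 eligible schools, 4,500 eligible students.
school_rate = response_rate(participating=142, eligible=150)
student_rate = response_rate(participating=3825, eligible=4500)

# PISA, for example, requires a weighted student response rate of at least 80%
# (see the discussion of PISA 2022 later in this article).
MIN_STUDENT_RATE = 0.80
print(f"School response rate:  {school_rate:.1%}")   # 94.7%
print(f"Student response rate: {student_rate:.1%}")  # 85.0%
print("Meets student threshold:", student_rate >= MIN_STUDENT_RATE)
```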

Beyond participation, participants’ engagement with the assessment also affects data validity. Disengaged test-takers (e.g., rapid guessers) are likely to underperform relative to their true ability, while disengaged questionnaire respondents are less likely to provide accurate information (e.g., Hopfenbeck and Maul, 2011; Michaelides et al., 2024). Student engagement with ILSAs can vary significantly across participating countries, which may compromise the comparability of results, although rapid guessing has a consistent negative impact on performance regardless of language, culture, or country (Guo and Ercikan, 2021). A fall-off in student engagement in PISA 2009 was identified as one contributor to Ireland’s weaker performance in that cycle (Cosgrove and Cartwright, 2014). Moreover, it is important that engagement levels do not vary systematically by subgroup, as this would introduce construct-irrelevant variance for some sections of the population. For example, Cosgrove (2011) showed some variation in engagement in PISA 2009 in Ireland by student socioeconomic background and gender, while Perkins (2015) similarly reported differences in engagement in PISA 2012 by gender and school socioeconomic status.
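
Rapid guessing of this kind is commonly operationalized through response times. The sketch below illustrates one widely used family of approaches—flagging item responses faster than a time threshold and summarizing each respondent’s “response-time effort”—under assumed values; it is an illustration, not the specific method used in the studies cited here.

```python
# Illustrative sketch: a fixed per-item time threshold (5 seconds here is an
# assumption) flags likely rapid guesses; "response-time effort" is the share
# of a student's responses given at a solution-based (non-rapid) pace.
from typing import Dict, List

RAPID_THRESHOLD_SECONDS = 5.0  # hypothetical threshold

def response_time_effort(item_times: List[float]) -> float:
    """Proportion of items answered at or above the rapid-guess threshold."""
    if not item_times:
        raise ValueError("No item response times supplied.")
    solution_based = sum(1 for t in item_times if t >= RAPID_THRESHOLD_SECONDS)
    return solution_based / len(item_times)

# Hypothetical process data: seconds spent per item by two students.
students: Dict[str, List[float]] = {
    "engaged student": [42.0, 37.5, 55.0, 61.2, 48.0],
    "rapid guesser":   [2.1, 1.8, 40.0, 2.5, 3.0],
}
for label, times in students.items():
    print(f"{label}: response-time effort = {response_time_effort(times):.2f}")
```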

With occasional exceptions, the participation standards set by international bodies have been met in Ireland, with response rates often high relative to rates internationally.9 Nevertheless, some variation across cycles and studies has been observed (Table 1). In particular, there was a rise in school participation post-2011; student participation has consistently been higher at primary level than at post-primary; and a slight dip in participation rates (mainly at post-primary) coincided with the COVID-19 pandemic.

Table 1. School and individual response rates in large-scale assessments conducted in Ireland.

Contextual factors likely to have influenced participation and engagement in Ireland are explored next. Subsequently, initiatives implemented by the authors’ institution to target improved participation and engagement are described, and their impact considered. Key lessons learned from Ireland’s experience are discussed, along with future research possibilities that could yield insights into facilitators and barriers to engagement.

What factors may have influenced participation and engagement?

Where participation rates have fluctuated, causation cannot be established definitively, and multiple factors are likely to be involved. The same is true of engagement levels, with the additional caveat that these are more difficult to quantify. Nevertheless, in this section we explore some factors that seem likely to have contributed to changes in participation and/or engagement in large-scale assessments in Ireland over the past 15 years.

National strategy: literacy and numeracy for learning and life 2011–2020

In 2011, a flagship National Strategy: Literacy and Numeracy for Learning and Life (DES (Department of Education and Skills), 2011) was launched, partly in response to the results of PISA 2009, which appeared to show a substantial drop relative to previous cycles in the reading and mathematics proficiency of 15-year-olds in Ireland (Perkins et al., 2010). The Strategy committed to Ireland’s continuing participation in ILSAs such as PIRLS, TIMSS, and PISA, as well as NAMER, and, for the first time, specified that schools sampled for these studies were expected to participate.

A practical corollary of this specification was that the national study centre (NSC) and the Department began to work more closely together on school recruitment.10 For instance, the initial invitation materials sent to sampled schools now routinely included letters both from a Department representative (typically, a senior inspector) and from the national study coordinator. Local inspectors, with whom schools often have a pre-existing relationship, also played a role in addressing concerns related to participation, highlighting the importance of the study and its value to the education system, and offering additional advice and support.

As illustrated in Table 1, school-level participation in large-scale assessments in Ireland has always been high, but it has been higher and more consistent since the implementation of the Strategy.11 For example, school-level participation in PISA ranged from 88% to 100% between 2000 and 2009, but from 97% to 100% between 2012 and 2022. Thus, it seems likely that the Strategy—including increased involvement by the Department in recruitment procedures—had a positive impact on school-level response rates.

Study modes: transitioning from paper to digital assessment

PISA, TIMSS, and PIRLS have gradually transitioned from paper-based to digital testing in recent cycles, and Ireland, on the whole, has been an early adopter of digital testing in this context (Delaney et al., 2024). However, initial efforts to use schools’ own devices for ILSA testing proved problematic. For example, surveys and site visits conducted as part of the field trials for PISA 2015 and PIRLS 201612 indicated that school devices were potentially unreliable as well as insufficient in quantity. As a result, the NSC has since been responsible for providing schools with both devices and technical support staff to manage their setup, troubleshoot as needed, and upload test data.

While the provision of technical support personnel was intended to reduce the burden on school staff, aspects of computer-based testing presented schools with new logistical challenges (e.g., arranging appropriate rooms for testing; limited flexibility to reschedule test dates; additional time needed to set up hardware). These issues caused significant dissatisfaction in some schools. However, overall, school-level response rates do not appear to have been negatively impacted by the move toward digital testing (Table 1).

PISA 2025 marks the first ILSA in Ireland to be conducted online. Unlike offline computer-based assessments, the online PISA assessment can be completed on any device that meets the minimum technical requirements and has internet access, without needing special software. While this offers significant flexibility for the NSC, it can pose challenges for some schools because of variation in the availability of schools’ own infrastructure. This issue was addressed in Ireland’s PISA field trial (held in 2024) by offering schools the alternative of using devices and portable internet solutions provided by the NSC, or opting for an offline assessment mode.

Some evidence suggests that moving to computer-based testing may have increased student engagement in some respects (e.g., Cosgrove and Moran, 2011). In PIRLS 2016, a higher proportion of students reported liking the digital ePIRLS texts (an average of 92% across five texts) than the comparable paper-based PIRLS texts (averaging 82% across six informational texts). Furthermore, students in Ireland were slightly less positive than the international average about paper-based texts and slightly more positive about digital texts (Eivers and Delaney, 2018). Another aspect is that computer-based assessments facilitate innovative interactive items that can enhance not only the validity of the assessment but also students’ engagement with it. However, such items can also present challenges, particularly for students who are engaging with interactive, simulation-based tasks for the first time (Shiel et al., 2016).

Overall, while the use of computer-based testing has introduced additional work in certain areas for Ireland’s NSC and school staff, there is no evidence that it has impacted on participation (negatively or positively) and indeed it may have enhanced student engagement. The impact of study mode on engagement remains an important focus of analysis in Ireland, given the transition to digital testing in TIMSS 2023 and PIRLS 2026.

Changes to data protection legislation and practices

As well as changes to the policy landscape or study design, ostensibly unrelated external events can impact on the practical administration of ILSAs and their participation rates. An example is the effect on successive cycles of TIMSS in Ireland of the introduction of two pieces of data protection legislation: the EU-wide General Data Protection Regulation (GDPR) in May 2018, followed by the Irish Data Sharing and Governance Act (DSGA) governing the sharing of personal data between public bodies, the final phase of which came into effect in 2022. In both cases, the new laws came into effect midway through TIMSS cycles, necessitating rapid responses to ensure compliance while maintaining high participation rates.

The timing of GDPR was significant as its introduction—accompanied by a sustained high-profile media focus on individuals’ data privacy and data protection—coincided with the TIMSS 2019 field trial (March/April 2018). Student participation rates in this field trial were noticeably lower than expected relative to previous assessments. In particular, the field trial saw an unusually high level of parental withdrawals, with 7% of Fourth grade students withdrawn in 2018 compared to just 0.2% in the TIMSS 2015 main study (Clerkin et al., 2016). Informal reports from schools and direct contact with the NSC indicated that concerns about data privacy and data-sharing were a key driver. Language-related difficulties encountered by some parents/guardians in reading the information provided about granting/refusing permission also played a role, and were exacerbated by the increasingly complex communications drafted to comply with GDPR requirements. In response, communications to parents/guardians were revised for the 2019 main study to provide more information in a clearer format, targeted to be accessible to adults with lower literacy levels. These revised information letters were also made available in several languages to reduce misunderstandings. In the end, the 2019 main study saw a Fourth grade withdrawal rate of 4.6% (Perkins and Clerkin, 2020)—substantially lower than in the field trial, but nonetheless higher than in previous cycles. This may have been partly related to the ongoing high profile of GDPR and public discussion of data privacy at this time. More recently, as procedures and communications about withdrawals have improved and public attitudes to data sharing have stabilized, withdrawal rates appear to have reverted to previous levels, with 0.3% of Fourth grade students withdrawn in TIMSS 2023 (McHugh et al., 2024b).

The introduction of DSGA brought significant challenges to data sharing between public bodies in Ireland. While most publicly-funded schools in Ireland are not classified as public entities under DSGA, schools managed by Education and Training Boards (ETBs)—about one-third of post-primary schools—are. This had practical implications as, from December 2022, ETB schools and the NSC (also a public entity) could no longer exchange personal data without a legal agreement, which required months of external review. This disrupted preparations for TIMSS 2023, with testing arrangements stalled and uncertainty about ETB school participation threatening the representativeness of the sample and the study’s validity. In the end, legal agreements were reached just as the main study testing window opened, meaning that affected schools could participate only after a major effort by both the NSC and school staff to put logistical arrangements in place. In that light, it is noteworthy that final school participation rates in TIMSS 2023 remained very high (McHugh et al., 2024b).

COVID-19 and disruption to education

In Ireland, as elsewhere, COVID-19 caused severe disruption to education and society. Two periods of nationwide school closures were implemented,13 during which teaching and learning were expected to continue remotely. Following each closure period, guidance was issued to schools to help them support students and limit the risk of contagion during the return to in-person instruction. Teachers were advised to prioritize certain curricular areas—literacy, numeracy, and wellbeing—while risk mitigation in classrooms included the allocation of students to small groups, seating layout adjustments, and use of face coverings. Thus, even when schools were open for in-person education during 2020 and 2021, the learning environment was atypical (Pitsia et al., 2024).

Because of the pandemic, the scope of NAMER 2021 was reduced, with students tested in one subject only at each grade level (Kiniry et al., 2023). For PIRLS 2021, testing in 14 countries including Ireland was delayed by about 6 months (Delaney et al., 2023).

Despite the challenging circumstances, school-level participation remained high in both studies. Student-level participation was also high, albeit slightly lower than in previous cycles (Table 1). However, the collection of parent/guardian questionnaire data was rendered more difficult by COVID-19 restrictions. In NAMER, parent/guardian questionnaires were not administered in this cycle. In PIRLS, both online and paper options were offered to parents/guardians in Ireland, and a minority of schools chose to use online questionnaires exclusively due to concerns about contamination from hard-copy materials. In schools that offered both options, most parents/guardians who returned a questionnaire did so on paper. In schools that offered the online option only, response rates were notably lower. Whether because some parents lacked access to technology or confidence in its use, or because teachers had more limited scope to track online questionnaire returns and issue reminders where needed, it appears that providing a paper-based option yields a higher response rate among parents/guardians in Ireland.

In contrast to NAMER and PIRLS—where the pandemic affected administration but did not significantly affect response rates—the PISA 2022 study at post-primary level was not only delayed but also affected by lower response rates. Originally planned for 2021, this PISA cycle was delayed internationally by 1 year due to COVID-19 and took place in Ireland in autumn 2022. School-level participation remained very high. However, the final weighted student response rate in Ireland (77%) did not meet the required minimum of 80% (Donohue et al., 2023b). Therefore, a non-response bias analysis was conducted to compare national state examination results for the achieved sample and the full sample. The results indicated a small upward bias in PISA 2022 achievement estimates compared to previous cycles in Ireland (Donohue et al., 2023a). Ireland’s PISA results, along with those from 11 other countries/economies, appeared in the international reports with an annotation.
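
The logic of such a non-response bias analysis can be illustrated with a toy example. The sketch below—hypothetical data and a deliberately simplified comparison, not the procedure reported by Donohue et al. (2023a)—compares an external benchmark (state examination scores) between the achieved sample and the full eligible sample; a higher benchmark mean among respondents points to upward bias in the achievement estimates.

```python
# Toy sketch of a non-response bias check using hypothetical benchmark scores.
from statistics import mean

# (state_exam_score, participated_in_assessment) for each sampled student.
full_sample = [
    (412, True), (388, False), (455, True), (501, True),
    (367, False), (440, True), (478, True), (395, False),
]

respondent_scores = [score for score, responded in full_sample if responded]
all_scores = [score for score, _ in full_sample]

bias = mean(respondent_scores) - mean(all_scores)
print(f"Respondent mean:       {mean(respondent_scores):.1f}")
print(f"Full-sample mean:      {mean(all_scores):.1f}")
print(f"Estimated upward bias: {bias:+.1f} points")  # positive => respondents scored higher
```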

The contrasting impact of COVID-19 on student-level participation in primary versus post-primary assessments reflects a pattern seen to a lesser degree in previous study cycles in Ireland. Probable contributing factors include higher absence rates and a higher likelihood of student (as opposed to parent/guardian) refusals among older cohorts, who may also be preparing for state examinations. In addition, many Grade 10 students participate in the one-year “Transition Year” programme, which often features out-of-school activities that make students unavailable for testing (Clerkin, 2019, 2020).

What initiatives have been introduced to improve participation and engagement?

Initiatives deployed by Ireland’s NSC to increase participation and engagement in large-scale assessments can be described under three main categories: consultation, promotion, and support.

Consultation

Each assessment is supported by an Advisory Committee comprising representatives from the Department; key stakeholders such as representative bodies for principals, teacher unions, other educational agencies, and parents’ associations; and/or subject experts. The Advisory Committee is briefed periodically by the NSC project team and provides feedback and guidance on planning, administration, and reporting. This input has allowed concerns that might impact on participation and engagement to be flagged and addressed at an early stage. Examples of issues raised include the need to provide substitute cover for teachers attending training for ILSAs; the benefits of providing information for parents/guardians in multiple languages; advice on providing plain language information about data protection; and insights into foreseeable practical challenges of bringing computers into schools for test sessions. Committee members can also help to raise awareness of the ongoing studies in their organizations and have helped to identify research topics of interest for secondary reporting.

Promotion

Often with assistance from Advisory Committee members, information about large-scale assessments is published in professional outlets (such as magazines or newsletters circulated by principals’ representative bodies or teachers’ unions) and sometimes presented at relevant professional conferences. It is hoped that making information about the studies’ purposes and findings available to school staff through familiar, trusted channels will support both participation and engagement.

For students, parents/guardians, and teachers, informational videos have been developed in recent cycles of PIRLS, TIMSS, and PISA. Student videos are designed with the target grade(s) in mind and provide accessible explanations of sampling, instruments, data protection, and other study aspects. Teachers may show the videos to their classes and discuss them before testing, with the aim of increasing students’ understanding of the study and, consequently, their engagement with it. Parent/guardian videos include information about the aims of the study and the purpose and processes of data collection. The intention is to address foreseeable concerns pre-emptively, thereby minimizing refusal rates and increasing parents’ motivation to contribute to the study. Videos for teachers and school staff focus on educating the school community about the importance of participation in studies and, when applicable, providing guidelines for proper administration. More recently, these videos have been made available on a range of social media platforms to ensure accessibility for various interest groups.

In some cycles of some studies, tokens of appreciation have been provided to thank students for their participation and encourage positive feelings toward the research. For example, in ICCS 2009 and PISA 2009, prize draws for vouchers were used to incentivize student participation, while in TIMSS 2015, PIRLS 2016, and PISA 2018, promotional pens were supplied with test materials. In PISA 2025, schools were provided with vouchers to offer snacks, food, and/or refreshments to students taking part in the study. Also, all participating students and schools received a certificate acknowledging their valuable contributions to the study. However, whether these measures had any effect on student engagement is unknown.

The launch of national reports on ILSAs is timed in Ireland to coincide with the international release. This helps to ensure media attention, raise public awareness, and promote future engagement. The launch typically features a speech from the Minister for Education or another senior Department figure, with stakeholder representatives invited as a mark of appreciation.

As a token of thanks for their participation, schools typically receive a brief summary of key school-level findings about student performance on the assessments and other relevant factors, such as interest in mathematics/science/reading and wellbeing. Schools are given guidance on appropriately cautious interpretation, since the assessments are designed to produce population-level estimates. Feedback is distributed separately to each participating school, ensuring confidentiality.

Finally, recent years have seen an increasing focus on disseminating findings in an accessible way for key stakeholder groups such as teachers, parents/guardians, and students. Infographics conveying highlights from the longer reports are routinely shared on social media and online. Additionally, some reports focus specifically on topics intended to be useful to teachers. For instance, McHugh et al. (2024a) used TIMSS data to explore relative strengths and weaknesses of students in Ireland in mathematics and science, with detailed discussion of performance across topic areas and individual items relative to the relevant Irish curricula, while Perkins and Shiel (2016) created a “teacher’s guide” which included links between the PISA 2012 mathematics and problem-solving results and the (then) recently updated mathematics curriculum. This reflects an increasing international trend whereby ILSA publications are targeted to teachers, such as the IEA’s Teacher Snippet series.14

Support

Large-scale assessments are typically administered in the spring, at a time when school staff and students are under significant time pressure. For example, post-primary students are often intensifying preparations for State examinations at this point in the year. Similarly, primary schools and students often have an increasing range of commitments at this time, such as sporting or other competitions and preparation for religious sacraments. Providing external support is therefore vital to ensuring strong participation and engagement. In Ireland, large-scale assessments are conducted differently at primary and post-primary levels to provide schools with the support they need. At primary level, teachers in the school typically serve as test administrators. In contrast, at post-primary level, where testing arrangements can be more complex (involving several class groups or grade levels), external test administrators are often assigned to participating schools. At both levels, technical support engineers are provided to manage the technical aspects of computer-based assessments. These external staff are recruited and trained by the NSC to minimize disruptions for students and teachers.

All schools are asked to appoint a coordinator to liaise between the school, the NSC, and the test administrators. To ensure that these coordinators feel well-supported to carry out their responsibilities, the NSC typically holds in-person or online events to familiarize coordinators with test procedures. For example, in PIRLS 2016, training sessions were held at a number of regional locations to provide coordinators with the chance to try the ePIRLS software. Similarly, during the PISA 2025 field trial, several well-attended Q&A sessions were organized, providing coordinators with reminders of their key tasks and an opportunity to address questions or concerns. Feedback from coordinators was positive and similar sessions will be incorporated into the main study.

Translating materials into languages relevant to participants is also a crucial element of support. In Ireland, this means that tests, questionnaires, letters, manuals, and other resources such as videos are routinely provided in both Irish and English (Irish is the medium of instruction in approximately 8% of schools). High-quality Irish-language versions provide stakeholders in Irish-medium schools with the same opportunity and motivation to participate and engage in studies as their counterparts in English-medium schools. Additionally, parents’/guardians’ information is now routinely translated into additional languages commonly used in Ireland, such as Polish, Lithuanian, Romanian, and Ukrainian; questionnaires for parents/guardians are also sometimes made available in multiple languages. The intention is to reduce any systematic disadvantage for these subgroups which might affect their participation.

Discussion

Examining the history of participation rates in large-scale assessments in Ireland shows that it can be challenging to predict how these may be affected by external events. Some developments that might be expected to affect participation rates directly (such as the move from paper to digital testing) do not seem to have had any notable impact—perhaps partly due to mitigation measures implemented by the NSC. On the other hand, developments that seemed further removed (such as the introduction of new data protection legislation) have coincided with substantial fluctuations in participation.

A key lesson from Ireland’s experience in securing high participation and engagement relates to the importance of stakeholder buy-in. At national level, it seems likely that the recognition afforded to large-scale assessments in the first National Strategy for Literacy and Numeracy (DES (Department of Education and Skills), 2011)—continued in the successor National Literacy, Numeracy and Digital Literacy Strategy 2024–2033 (DoE (Department of Education), 2024)—supported school-level participation and raised awareness of the purpose of large-scale assessments, which may also have positively impacted engagement. The inclusion of stakeholder representatives such as parents/guardians, teachers, and principals on Advisory Committees has helped the NSC to pre-empt concerns and develop strategies to boost participation and engagement among these groups.

More generally, it is desirable to enhance further the shared understanding among education partners of the value of high-quality data for monitoring and evaluation in education. At school level, there is a recognized need in Ireland for training and guidance in using data to inform planning for school improvement (OECD, 2024a). There is also a recognized need for better system-level student data to support monitoring and evaluation (Gilleece and Clerkin, 2024; OECD, 2024a). Given the current reliance on non-administrative data sources as a result of limited administrative data, it is imperative that all education stakeholders in Ireland recognize the importance of securing high participation rates in representative large-scale assessments. Guidance for school leaders and policy-makers should focus not only on how data (such as standardized test results) can be used for planning and improvement at the school-level, but also on the wider value of data (such as large-scale assessment findings) for system improvement (Clerkin and Delaney, 2025).

While participation rates can be quantified with high precision, engagement remains more elusive to measure. However, process data generated from digital assessments provide new avenues of enquiry in this regard. Variables such as response latencies and navigation choices have been suggested as proxies for student engagement with tests. Cosgrove and Cartwright (2014) go so far as to suggest that the reporting of ILSA results should incorporate response latencies into the estimation of individual student proficiency or aggregate performance, given the importance of active engagement with assessment tasks as well as cognitive proficiency. Despite the effect that engagement can have on individual student scores, its impact on aggregate scores and country rankings has been shown to be minimal in an analysis of PISA 2015 results (Michaelides et al., 2024). As Ireland progresses further in using digital modalities for large-scale assessment, new possibilities for understanding patterns of engagement among the population and subgroups can be anticipated.
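
As a concrete, deliberately simplified illustration of the kind of analysis such process data permit—a sketch under assumed values, not the method of Cosgrove and Cartwright (2014) or Michaelides et al. (2024)—an aggregate score can be recomputed after filtering out responses flagged as rapid guesses and compared with the unfiltered estimate:

```python
# Sketch: compare an aggregate percent-correct estimate before and after
# removing responses flagged as rapid guesses (threshold is hypothetical).
from statistics import mean

RAPID_THRESHOLD_SECONDS = 5.0

# Hypothetical item-level records: (score 0/1, response time in seconds).
responses = [
    (1, 48.2), (0, 2.1), (1, 37.0), (0, 3.4), (1, 60.5),
    (0, 41.8), (1, 2.8), (1, 52.3), (0, 45.0), (1, 39.9),
]

unfiltered = mean(score for score, _ in responses)
filtered = mean(score for score, t in responses if t >= RAPID_THRESHOLD_SECONDS)

print(f"Percent correct, all responses:          {unfiltered:.1%}")
print(f"Percent correct, rapid guesses excluded: {filtered:.1%}")
```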

Data availability statement

Publicly available datasets were analyzed in this study. These data can be found at: https://timssandpirls.bc.edu/pirls2016/international-database/index.html. Please note that data reported in international and (Irish) national reports on various cycles of ICCS, PIRLS, PISA, TIMSS, and TALIS, as well as reports on Ireland’s National Assessments of English Reading and Mathematics, were also collated for the purposes of this article (specifically for Table 1).

Ethics statement

Ethical approval was not required for the study involving humans in accordance with the local legislation and institutional requirements. Written informed consent to participate in this review was not required from the participants or the participants’ legal guardians/next of kin in accordance with the national legislation and the institutional requirements.

Author contributions

ED: Conceptualization, Writing – original draft, Writing – review & editing. AC: Conceptualization, Writing – original draft, Writing – review & editing. AK: Conceptualization, Writing – original draft, Writing – review & editing. LG: Conceptualization, Writing – original draft, Writing – review & editing. DM: Conceptualization, Writing – original draft, Writing – review & editing.

Funding

The author(s) declare that no financial support was received for the research and/or publication of this article.

Acknowledgments

The authors would like to thank Andrew Keating for his assistance in reviewing an earlier version of the manuscript.

Conflict of interest

The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

Generative AI statement

The authors declare that no Gen AI was used in the creation of this manuscript.

Publisher’s note

All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article, or claim that may be made by its manufacturer, is not guaranteed or endorsed by the publisher.

Footnotes

1. ^Previously known as the Department of Education and Skills [DES]. The term “the Department” is used throughout this article, with acronyms DoE and DES applied only in relation to references.

2. ^Trends in International Mathematics and Science Study.

3. ^Programme for International Student Assessment.

4. ^Progress in International Reading Literacy Study.

5. ^Teaching and Learning International Survey.

6. ^International Civics and Citizenship Education Study.

7. ^Large-scale assessments outside school settings also exist – for example, PIAAC (Survey of Adult Skills), which assesses the skills of adults aged 16–65 with data collection generally occurring in participants’ homes. Ireland has taken part in PIAAC in both Cycle 1 (c. 2012) and Cycle 2 (c. 2023), as well as the precursor International Adult Literacy Survey (1995). However, the focus of this paper is on school-based large-scale assessments.

8. ^For schools and classes, response rates are typically estimated based on the participation rate of sampled eligible students. Response rates at the student level are typically estimated based on the number of participating students relative to the total number of eligible students, excluding those who do not meet the eligibility criteria and those with special educational needs who are unable to participate.

9. ^Two exceptions were TALIS 2008 (lower school response rate) and PISA 2022 (lower student response rate). In both instances, further information was required to estimate the extent of bias in the achieved sample (Donohue et al., 2023a; Gilleece et al., 2009).

10. ^A coordinated approach to school recruitment was sometimes deployed pre-2011 – for example, in ICCS 2009, sampled schools that had not responded to the NSC’s initial invitation received letters from the Department encouraging them to participate. However, this approach became more systematic during the lifetime of the Strategy.

11. ^As the Strategy was released midway through 2011, the TIMSS and PIRLS cycle administered in spring 2011 was pre-Strategy.

12. ^While PIRLS 2016 was paper-based, an optional digital literacy assessment (ePIRLS) was administered in some countries (Eivers et al., 2017). ePIRLS required students to navigate a hyperlinked environment simulating the internet, focusing on informational rather than literary texts.

13. ^From mid-March to June 2020, and from January to March/April 2021.

14. ^https://www.iea.nl/publications/iea-teacher-snippets

References

Almaskut, A., LaRoche, S., and Foy, P. (2023). “Sample design in PIRLS 2021” in Methods and procedures: PIRLS 2021 technical report. eds. M. Von Davier, I. V. S. Mullis, B. Fishbein, and P. Foy (Boston, MA: TIMSS & PIRLS International Study Center, Boston College).

Clerkin, A. (2019). The transition year experience: Student perceptions and school variation. Dublin: Educational Research Centre.

Clerkin, A. (2020). A three-wave longitudinal assessment of socioemotional development in a year-long school-based ‘gap year. Br. J. Educ. Psychol. 90, 109–129. doi: 10.1111/bjep.12267

Clerkin, A., and Delaney, E. (2025). How large-scale assessments have informed education policy in Ireland. IEA Compass: Briefs in Education No 26. IEA. Available online at: https://www.iea.nl/sites/default/files/2025-01/CB26-LSA-Data-Informing-Policy-Ireland.pdf (Accessed April 25, 2025).

Clerkin, A., Perkins, R., and Cunningham, R. (2016). TIMSS 2015 in Ireland: Mathematics and science in primary and post-primary schools. Dublin: Educational Research Centre.

Cosgrove, J. (2011). Does student engagement explain performance on PISA? Comparisons of response patterns on the PISA tests over time. Dublin: Educational Research Centre.

Cosgrove, J., and Cartwright, F. (2014). Changes in achievement on PISA: the case of Ireland and implications for international assessment practice. Large-Scale Assessments Educ. 2, 1–17. doi: 10.1186/2196-0739-2-2

Cosgrove, J., Gilleece, L., and Shiel, G. (2011). International civic and citizenship study: Report for Ireland. Dublin: Educational Research Centre.

Cosgrove, J., and Moran, G. (2011). Taking the PISA 2009 test in Ireland: Students’ response patterns on the print and digital assessments. Dublin: Educational Research Centre.

Cosgrove, J., Shiel, G., Sofroniou, N., Zastrutzki, S., and Shortt, F. (2005). Education for life: The achievement of 15-year-olds in Ireland in the second cycle of PISA. Dublin: Educational Research Centre.

Delaney, E., McAteer, S., Delaney, M., McHugh, G., and O’Neill, B. (2023). PIRLS 2021: Reading results for Ireland. Dublin: Educational Research Centre.

Delaney, E., O’Flaherty, A., Perkins, R., Clerkin, A., and Cunningham, R. (2024). “Computer-based testing in Ireland, 2005-2024: challenges, lessons learned, and future possibilities” in E-testing and computer-based assessment: CIDREE yearbook 2024. eds. B. Ranđelović, E. Karalić, K. Aleksić, and D. Đukić (Belgrade: Institute for Education Quality and Evaluation).

DES (Department of Education and Skills) (2011). Literacy and numeracy for learning and life: The national strategy to improve literacy and numeracy among children and young people 2011–2020. Dublin: DES.

DoE (Department of Education) (2024). Ireland’s Literacy, Numeracy and Digital Literacy Strategy 2024-2033: Every Learner from Birth to Young Adulthood. Available online at: https://assets.gov.ie/293255/a509a8d7-a4ac-43f9-acb0-29cdc26a1327.pdf

Donohue, B., Moran, E., Clerkin, A., Millar, D., O’Flaherty, A., Piccio, G., et al. (2024). Digital learning framework (DLF) national longitudinal evaluation: Wave 2 final report. Dublin: Educational Research Centre.

Donohue, B., Perkins, R., Millar, D., Walsh, T., and Delaney, M. (2023a). PISA 2022: Non-response bias analysis for Ireland. Dublin: Educational Research Centre.

Donohue, B., Perkins, R., Walsh, T., O’Neill, B., Duibhir, Ó., and Duggan, A. (2023b). Education in a dynamic world: The performance of students in Ireland in PISA 2022. Dublin: Educational Research Centre.

Eivers, E. (2010). PISA: issues in implementation and interpretation. Ir. J. Educ. 38, 94–118.

Eivers, E., Close, S., Shiel, G., Millar, D., Clerkin, A., Gilleece, L., et al. (2010). The 2009 National Assessments of mathematics and English Reading. Dublin: Educational Research Centre.

Eivers, E., and Delaney, M. (2018). PIRLS and ePIRLS 2016. Test content and Irish pupils’ performance. Dublin: Educational Research Centre.

Eivers, E., Gilleece, L., and Delaney, E. (2017). Reading achievement in PIRLS 2016: Initial report for Ireland. Dublin: Educational Research Centre.

Elfert, M., and Ydesen, C. (2023). “UNESCO, the OECD and the World Bank: A global governance perspective” in Global governance of education: The historical and contemporary entanglements of UNESCO, the OECD and the World Bank. eds. M. Elfert and C. Ydesen (Cham: Springer International Publishing), 23–50.

Foy, P. (1997). “Implementation of the TIMSS sample design” in TIMSS technical report volume II: Implementation and analysis (primary and middle school years). eds. M. O. Martin and D. L. Kelly (Boston, MA: TIMSS International Study Center, Boston College), 21–46.

Gilleece, L., and Clerkin, A. (2024). Towards more robust evaluation of policies and programmes in education: identifying challenges in evaluating DEIS and Reading recovery. Ir. Educ. Stud. 22, 1–29. doi: 10.1080/03323315.2024.2334704

Gilleece, L., Shiel, G., and Perkins, R. (2009). Teaching and learning international survey (2008). National report for Ireland. Dublin: Educational Research Centre.

Guo, H., and Ercikan, K. (2021). Differential rapid responding across language and cultural groups. Educ. Res. Eval. 26, 302–327. doi: 10.1080/13803611.2021.1963941

Hopfenbeck, T. N., and Maul, A. (2011). Examining evidence for the validity of PISA learning strategy scales based on student response processes. Int. J. Test. 11, 95–121. doi: 10.1080/15305058.2010.529977

Joncas, M. (2012a). “Meeting PIRLS 2011 standards for sampling participation” in Methods and procedures in TIMSS and PIRLS 2011. eds. M. O. Martin and I. V. S. Mullis (Boston, MA: TIMSS & PIRLS International Study Center, Boston College).

Joncas, M. (2012b). “Meeting TIMSS 2011 standards for sampling participation” in Methods and procedures in TIMSS and PIRLS 2011. eds. M. O. Martin and I. V. S. Mullis (Boston, MA: TIMSS & PIRLS International Study Center, Boston College).

Kiniry, J., Duggan, A., Karakolidis, A., Cunningham, R., and Millar, D. (2023). NAMER 2021. The National Assessments of mathematics and English Reading. Performance report. Dublin: Educational Research Centre.

Klemenčič, E., and Mirazchiyski, P. V. (2018). League tables in educational evidence-based policy-making: can we stop the horse race, please? Comp. Educ. 54, 309–324. doi: 10.1080/03050068.2017.1383082

LaRoche, S., and Foy, P. (2017). “Sample implementation in PIRLS 2016” in Methods and procedures in PIRLS 2016. eds. M. O. Martin, I. V. S. Mullis, and P. Foy (Boston, MA: TIMSS & PIRLS International Study Center, Boston College), 5.1–5.126.

LaRoche, S., Joncas, M., and Foy, P. (2020). “Sample design in TIMSS 2019” in Methods and procedures: TIMSS 2019 technical report. eds. M. O. Martin, M. von Davier, and I. V. S. Mullis (Boston, MA: TIMSS & PIRLS International Study Center, Boston College).

McHugh, G., Clerkin, A., Cunningham, R., and Perkins, R. (2024a). An in-depth analysis of the relative strengths and weaknesses of students in Ireland in mathematics and science in TIMSS 2019. Dublin: Educational Research Centre.

McHugh, G., Denner, S., Clerkin, A., Piccio, G., and Pitsia, V. (2024b). TIMSS 2023: Insights into mathematics and science achievement in Ireland. Dublin: Educational Research Centre.

McKeown, C., Denner, S., McAteer, S., Shiel, G., and O'Keeffe, L. (2019). Learning for the future: The performance of 15-year-olds in Ireland on reading literacy, science and mathematics in PISA 2018. Dublin: Educational Research Centre.

McNamara, G., Skerrit, C., O'Hara, J., O'Brien, S., and Brown, M. (2022). For improvement, accountability, or the economy? Reflecting on the purpose(s) of school self-evaluation in Ireland. J. Educ. Admin. History 54, 158–173. doi: 10.1080/00220620.2021.1985439

Michaelides, M. P., Ivanova, M. G., and Avraam, D. (2024). The impact of filtering out rapid-guessing examinees on PISA 2015 country rankings. Psychol. Test Assess. Model. 66, 50–62. doi: 10.2440/001-0012

Niemann, D., and Martens, K. (2018). Soft governance by hard fact? The OECD as a knowledge broker in education policy. Global Social Policy 18, 267–283. doi: 10.1177/1468018118794076

OECD (2009). PISA 2006 technical report. Paris: OECD Publishing.

OECD (2010). TALIS 2008 technical report. Paris: OECD Publishing.

OECD (2024a). OECD review of resourcing schools to address educational disadvantage in Ireland. Paris: OECD Publishing.

OECD (2024b). PISA 2022 technical report. Paris: OECD Publishing.

Perkins, R. (2015). The role of engagement and test-taking behaviour in PISA 2012 in Ireland. Ir. J. Educ. 40, 45–67.

Perkins, R., and Clerkin, A. (2020). TIMSS 2019: Ireland’s results in mathematics and science. Dublin: Educational Research Centre.

Perkins, R., Cosgrove, J., Moran, G., and Shiel, G. (2012). PISA 2009: Results for Ireland and changes since 2000. Dublin: Educational Research Centre.

Perkins, R., Moran, G., Cosgrove, J., and Shiel, G. (2010). PISA 2009: The performance and progress of 15-year-olds in Ireland. Summary report. Dublin: Educational Research Centre.

Perkins, R., and Shiel, G. (2016). “A teacher’s guide to PISA mathematics and problem solving” in Findings from PISA 2012 (Dublin: Educational Research Centre).

Perkins, R., Shiel, G., Merriman, B., Cosgrove, J., and Moran, G. (2013). Learning for life: The achievements of 15-year-olds in Ireland on mathematics, Reading literacy and science in PISA 2012. Dublin: Educational Research Centre.

Pitsia, V., McAteer, S., McHugh, G., and Delaney, E. (2024). PIRLS 2021: Exploring the contexts for reading of primary school pupils in Ireland. Dublin: Educational Research Centre.

Shiel, G., Cosgrove, J., Sofroniou, N., and Kelly, A. (2001). Ready for life? The literacy achievements of Irish 15-year-olds with comparative international data. Dublin: Educational Research Centre.

Shiel, G., Kavanagh, L., and Millar, D. (2014). The 2014 National Assessments of English Reading and mathematics. Performance report. Dublin: Educational Research Centre.

Shiel, G., Kelleher, C., McKeown, C., and Denner, S. (2016). Future ready? The performance of 15-year-olds in Ireland on science, reading literacy and mathematics in PISA 2015. Dublin: Educational Research Centre.

Steiner-Khamsi, G., Martens, K., and Ydesen, C. (2024). Governance by numbers 2.0: policy brokerage as an instrument of global governance in the era of information overload. Comp. Educ. 60, 537–554. doi: 10.1080/03050068.2024.2308348

Keywords: large-scale assessment (LSA), ILSA, educational measurement and assessment, participation, engagement, TIMSS/PIRLS, PISA, national assessment

Citation: Delaney E, Clerkin A, Karakolidis A, Gilleece L and Millar D (2025) Promoting participation and engagement: Ireland’s experiences across three decades of international large-scale assessments. Front. Educ. 10:1565557. doi: 10.3389/feduc.2025.1565557

Received: 23 January 2025; Accepted: 10 April 2025;
Published: 15 May 2025.

Edited by:

Surette Van Staden, University of Innsbruck, Austria

Reviewed by:

Hongwen Guo, Educational Testing Service, United States
Klaus Buddeberg, University of Hamburg, Germany

Copyright © 2025 Delaney, Clerkin, Karakolidis, Gilleece and Millar. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.

*Correspondence: Emer Delaney, emer.delaney@erc.ie; Aidan Clerkin, aidan.clerkin@erc.ie
