
Editorial

Front. Educ.

Sec. Assessment, Testing and Applied Measurement

This article is part of the Research Topic: How do we collect all this data? A performative account of International Large-Scale Assessment data collection in times of systemic diversity

Editorial: How did we collect all this data? A performative account of PIRLS data collection in challenging times

Provisionally accepted
  • 1University of Innsbruck, Innsbruck, Austria
  • 2International Association for the Evaluation of Educational Achievement, Amsterdam, Netherlands

The final, formatted version of the article will be published soon.

Participation in each cycle of assessment does not involve a mechanical copy or repetition of the study from one cycle to the next (van Staden and Combrinck, 2023). Instead, each new cycle is administered as a large-scale cross-sectional survey with country-level adjustments to design and sampling that are closely monitored with a view to comparability over time. These adjustments ensure that, as far as possible, the overall results accurately reflect the academic achievement of participating students and provide a (selective) picture of their national, school, classroom and home backgrounds for a particular cycle of participation.

The reporting of standardised operating procedures, and the intention of highly standardised practices in administering ILSAs, are well documented. So too are achievement reports, both national and international, that provide a track record of countries' educational attainment (see, for example, Džumhur et al., 2025). What is less pronounced for reporting purposes is the 'black box' of how the intention of standardised processes plays out in each participating country during data collection. The data for each country are subject to the specific conditions under which they were collected, that is, the feasibility of nationally representative samples (strata, languages of testing, populations, etc.), raising questions about how accurately the data reflect reality. However, our consideration is not primarily about the reliability, validity and credibility of ILSA data as a result of standardised procedures. Rather, we argue that the process of data collection plays a pivotal role in translating high degrees of standardisation into implementation at the country level. In this way, data collection in and of itself tells us something about the context of the education system in which ILSAs are administered. This collection aims to describe country-level data collection processes, challenges and considerations, as outlined below.

Two Southern Hemisphere perspectives, from New Zealand and South Africa, provide test administration insights. While the collection includes a conceptual analysis of the prospects for developing culturally inclusive models of education by Urrutia-Jorde, the New Zealand example (Chamberlain and Bennett) speaks to issues of cultural inclusivity by detailing decisions about languages of testing in small, marginalised populations. New Zealand's experience serves as an example of a post-colonial country negotiating the need and desire to be educationally inclusive, while facing questions about whether smaller populations meet ILSAs' rigorous participation standards. The South African case by Roux highlights test administration in a developing, socio-economically diverse, multicultural and multilingual context. Despite significant advances in administrative capacity, stakeholder engagement and methodological rigour, persistent issues such as resource constraints, infrastructural disparities and a lack of skilled capacity continue to pose administration challenges.

From the Northern Hemisphere, Ireland and Italy describe their investments in increasing response rates. The Irish perspective by Delaney et al. details efforts to promote participation and engagement through initiatives that place consultation, promotion and support for studies like PIRLS at their centre. Italy's review of the existing literature highlights student motivation as central to low-stakes assessments, while the country's empirical experience points to customised communication strategies, logistical support and privacy safeguards, particularly in engaging with parents, as essential for securing engagement and trust (Palmerio and Caponera). A contribution of original research from Latvia (Kampmane et al.) paints a picture of the optimal conditions for ensuring voluntary participation in controlled samples; data from this study show that the motivation of teachers, principals and school coordinators to complete questionnaires is strongly affected by their perception of the meaningfulness and relevance of the survey to their day-to-day educational activities.

This collection of articles is based on the idea that international large-scale assessments (ILSAs) can identify systemic problems and monitor and benchmark achievement, providing participating countries with accurate, valid and reliable data. Well-documented, standardised processes and procedures that result in nationally representative achievement and contextual background data for participants are of the utmost importance for achieving international comparability. Yet the intention of standardised processes and procedures must be balanced with strong institutional support and thoughtful, flexible implementation strategies to maintain high levels of participation across all countries, sampled populations and groups of respondents.

Keywords: data collection, implementation, international large-scale assessments (ILSA), Progress in International Reading Literacy Study (PIRLS), standardised procedures

Received: 09 Dec 2025; Accepted: 15 Dec 2025.

Copyright: © 2025 Van Staden, Kraler and Korsnakova. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) or licensor are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.

* Correspondence: Surette Van Staden

Disclaimer: All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article or claim that may be made by its manufacturer is not guaranteed or endorsed by the publisher.