AUTHOR=Seaborn Katie, Nakamura Satoshi
TITLE=Quality and representativeness of research online with Yahoo! Crowdsourcing
JOURNAL=Frontiers in Psychology
VOLUME=16
YEAR=2025
URL=https://www.frontiersin.org/journals/psychology/articles/10.3389/fpsyg.2025.1588579
DOI=10.3389/fpsyg.2025.1588579
ISSN=1664-1078
ABSTRACT=Introduction: Conducting research online has become common in human participant research, notably in the field of human-computer interaction (HCI). Many researchers have used English-language, Western participant pools and recruitment platforms such as Amazon Mechanical Turk and Prolific, whose panel quality and representativeness are known to vary greatly. Less is known about non-English, non-Western options. We consider Japan, a nation that produces a significant portion of HCI research, and report on an evaluation of the widely used Yahoo! Crowdsourcing (YCS) recruitment platform. Methods: We evaluated 65 data sets comprising N = 60,681 participants, primarily focusing on the 42 data sets with complete metadata from studies requiring earnest participation (n = 29,081). Results: We found generally high completion (77.6%) and retention (70.1%) rates. Notably, studies using multimedia stimuli exhibited higher completion (97.7%) and retention (91.9%) rates. We also found that the “general” participant setting attracted middle-aged men. Discussion: We offer guidelines for best practice, such as online questionnaire design strategies to increase data quality and filtering to capture a more representative audience. We reveal the nature, power, and limitations of YCS for HCI and other fields conducting human participant research online.