Original Research
10 February 2021
Monitoring Systems for Checking Websites on Accessibility
Andreas Burkard, Bettina Schwarzer and 1 more

Web accessibility monitoring systems support users in checking entire websites for accessibility issues. Although these tools can check compliance with only some of the many success criteria of the Web Content Accessibility Guidelines, they help quality assurance personnel, web administrators and web authors continuously discover hotspots of barriers and overlooked accessibility issues. These tools should be effective in identifying accessibility issues. Furthermore, they should motivate users, as this promotes employee productivity and increases interest in accessibility in general. In a comparative study, we applied four commercial monitoring systems to two of the Stuttgart Media University's websites. The tools are: 1) the Accessibility module of Siteimprove from Siteimprove, 2) Pope Tech from Pope Tech, 3) WorldSpace Comply (now called axe Monitor) from Deque, and 4) ARC Monitoring from The Paciello Group. The criteria catalogue consists of functional criteria gleaned from the literature and user experience criteria based on the User Experience Questionnaire. Based on a focus group of experts from Stuttgart Media University, we derived individual weights for the criteria. The functional evaluation criteria are: coverage of the website and the guidelines, completeness, correctness, support in locating errors, support for manual checks, degree of implementing gamification patterns, support for various input and report formats, and methodological support for the Website Accessibility Conformance Evaluation Methodology 1.0 and for the German procurement law for public authorities, Barrierefreie Informationstechnik-Verordnung 2.0. To assess the user experience criteria, we conducted exploratory think-aloud user tests (n = 15) using a coaching approach. Each participant tested every tool for 15 min (within-subject design). The participants completed post-test questionnaires, including the User Experience Questionnaire. According to our results, Siteimprove turned out to be the best tool for our purposes.
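The weighted-criteria comparison described above can be sketched as a simple weighted sum: each tool receives a score per criterion, each criterion carries a focus-group weight, and the overall score ranks the tools. The criteria names, weights, and scores below are illustrative placeholders, not the study's actual values.

```python
# A minimal weighted-criteria ranking sketch. All names and numbers are
# hypothetical; the study's real criteria, weights, and scores differ.
weights = {
    "completeness": 0.25,
    "correctness": 0.25,
    "error_localisation": 0.20,
    "manual_check_support": 0.15,
    "user_experience": 0.15,
}

# Illustrative per-tool scores on a 0-5 scale (made-up numbers).
scores = {
    "Tool A": {"completeness": 4, "correctness": 3, "error_localisation": 5,
               "manual_check_support": 4, "user_experience": 4},
    "Tool B": {"completeness": 3, "correctness": 4, "error_localisation": 3,
               "manual_check_support": 3, "user_experience": 5},
}

def weighted_score(tool_scores: dict, weights: dict) -> float:
    """Sum each criterion score multiplied by its weight."""
    return sum(weights[c] * tool_scores[c] for c in weights)

# Rank tools from best to worst overall score.
ranking = sorted(scores, key=lambda t: weighted_score(scores[t], weights),
                 reverse=True)
```

A real evaluation would also normalise criteria measured on different scales (e.g. UEQ items vs. binary feature checks) before weighting.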
