<?xml version="1.0" encoding="UTF-8" standalone="no"?>
<!DOCTYPE article PUBLIC "-//NLM//DTD Journal Publishing DTD v2.3 20070202//EN" "journalpublishing.dtd">
<article xmlns:mml="http://www.w3.org/1998/Math/MathML" xmlns:xlink="http://www.w3.org/1999/xlink" article-type="research-article">
<front>
<journal-meta>
<journal-id journal-id-type="publisher-id">Front. ICT</journal-id>
<journal-title>Frontiers in ICT</journal-title>
<abbrev-journal-title abbrev-type="pubmed">Front. ICT</abbrev-journal-title>
<issn pub-type="epub">2297-198X</issn>
<publisher>
<publisher-name>Frontiers Media S.A.</publisher-name>
</publisher>
</journal-meta>
<article-meta>
<article-id pub-id-type="doi">10.3389/fict.2017.00008</article-id>
<article-categories>
<subj-group subj-group-type="heading">
<subject>ICT</subject>
<subj-group>
<subject>Original Research</subject>
</subj-group>
</subj-group>
</article-categories>
<title-group>
<article-title>Computer Screen Use Detection Using Smart Eyeglasses</article-title>
</title-group>
<contrib-group>
<contrib contrib-type="author" corresp="yes">
<name><surname>Wahl</surname> <given-names>Florian</given-names></name>
<xref ref-type="aff" rid="aff1"><sup>1</sup></xref>
<xref ref-type="corresp" rid="cor1">&#x0002A;</xref>
<uri xlink:href="http://frontiersin.org/people/u/371375"/>
</contrib>
<contrib contrib-type="author">
<name><surname>Kasbauer</surname> <given-names>Jakob</given-names></name>
<xref ref-type="aff" rid="aff1"><sup>1</sup></xref>
<uri xlink:href="http://frontiersin.org/people/u/376880"/>
</contrib>
<contrib contrib-type="author">
<name><surname>Amft</surname> <given-names>Oliver</given-names></name>
<xref ref-type="aff" rid="aff1"><sup>1</sup></xref>
<uri xlink:href="http://frontiersin.org/people/u/192147"/>
</contrib>
</contrib-group>
<aff id="aff1"><sup>1</sup><institution>ACTLab, Chair of Sensor Technology, University of Passau</institution>, <addr-line>Passau</addr-line>, <country>Germany</country></aff>
<author-notes>
<fn fn-type="edited-by"><p>Edited by: Koichi Kise, Osaka Prefecture University, Japan</p></fn>
<fn fn-type="edited-by"><p>Reviewed by: Thuong Hoang, University of Melbourne, Australia; Kiyoshi Kiyokawa, Osaka University, Japan; Takashi Miyaki, Tokyo University, Japan</p></fn>
<corresp content-type="corresp" id="cor1">&#x0002A;Correspondence: Florian Wahl, <email>wahl&#x00040;fim.uni-passau.de</email></corresp>
<fn fn-type="other" id="fn001"><p>Specialty section: This article was submitted to Mobile and Ubiquitous Computing, a section of the journal Frontiers in ICT</p></fn>
</author-notes>
<pub-date pub-type="epub">
<day>05</day>
<month>05</month>
<year>2017</year>
</pub-date>
<pub-date pub-type="collection">
<year>2017</year>
</pub-date><volume>4</volume>
<elocation-id>8</elocation-id>
<history>
<date date-type="received">
<day>31</day>
<month>08</month>
<year>2016</year>
</date>
<date date-type="accepted">
<day>13</day>
<month>04</month>
<year>2017</year>
</date>
</history>
<permissions>
<copyright-statement>Copyright &#x000A9; 2017 Wahl, Kasbauer and Amft.</copyright-statement>
<copyright-year>2017</copyright-year>
<copyright-holder>Wahl, Kasbauer and Amft</copyright-holder>
<license xlink:href="http://creativecommons.org/licenses/by/4.0/"><p>This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) or licensor are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.</p></license>
</permissions>
<abstract>
<p>Screen use can influence the circadian phase and cause eye strain. Smart eyeglasses with an integrated color light sensor can detect screen use. We present a screen use detection approach based on a light sensor embedded into the bridge of smart eyeglasses. By calculating the light intensity at the user&#x02019;s eyes for different screens and content types, we found only computer screens to have a significant impact on the circadian phase. Our detection approach is based on ratios between color channels and uses a linear support vector machine to detect screen use. We validated our detection approach in three studies. A test bench was built to detect screen use under different ambient light sources and intensities in a controlled environment. In a lab study, we evaluated recognition performance for different ambient light intensities. By using participant-independent models, we achieved an ROC AUC above 0.9 for ambient light intensities below 200&#x02009;lx. In a study of typical activities of daily living (ADLs), screen use was detected with an average ROC AUC of 0.83 assuming screen use for 30% of the time.</p>
</abstract>
<kwd-group>
<kwd>eyewear</kwd>
<kwd>circadian clock</kwd>
<kwd>wearable sensor</kwd>
<kwd>activities of daily living</kwd>
<kwd>glasses</kwd>
<kwd>activity recognition</kwd>
<kwd>eye strain</kwd>
</kwd-group>
<counts>
<fig-count count="9"/>
<table-count count="4"/>
<equation-count count="4"/>
<ref-count count="32"/>
<page-count count="12"/>
<word-count count="8948"/>
</counts>
</article-meta>
</front>
<body>
<sec id="S1" sec-type="introduction">
<label>1</label> <title>Introduction</title>
<p>People spend a large portion of the day looking at computer, television, or tablet screens. For example, in 2015, the average US adult spent 9&#x02009;h and 52&#x02009;min in front of different screens every day (eMarketer, <xref ref-type="bibr" rid="B12">2016</xref>). Many office workers spend more than 6&#x02009;h in front of a computer screen (PresseBox, <xref ref-type="bibr" rid="B21">2008</xref>). Extended screen use often causes eye strain, the most common repetitive strain injury. For example, in the United States, 65% of the population suffer from eye strain (The Vision Council of America, <xref ref-type="bibr" rid="B25">2016</xref>). In addition, screen use can influence the circadian phase. The circadian clock is entrained by timing and intensity of light exposure. Morning light exposure advances circadian clock phase, and light exposure in the evening delays it (Revell and Eastman, <xref ref-type="bibr" rid="B22">2005</xref>). The circadian system is most sensitive in the blue range of the light spectrum (Brainard et al., <xref ref-type="bibr" rid="B6">2001</xref>). LED-backlit screens emit high energy in the blue light spectrum compared to other wavelengths and other indoor light sources. Thus, screen use could shift circadian phase (Cajochen et al., <xref ref-type="bibr" rid="B7">2011</xref>), leading, e.g., to difficulty falling asleep at night and, consequently, sleep deprivation. A fundamental requirement for guidance and intervention to prevent eye- or sleep-related health problems is therefore to detect screen use when it actually occurs.</p>
<p>Ambient sensors, e.g., proximity sensors (Jaramillo-Garcia et al., <xref ref-type="bibr" rid="B16">2014</xref>), do not suffice for screen use detection as they cannot distinguish mere presence from looking at a screen. To detect screen use, light measurements have to be taken as close as possible to the eye. Only a few head-worn wearables have been proposed for context recognition that could robustly detect light received at the eyes. Eyeglasses are practical, everyday accessories and could house a light sensor without changing their main function or substantially modifying their appearance. Considering the sensor&#x02019;s typical field of view of 60&#x000B0;, the best position to measure light actually received at the eyes is between the eyes, i.e., at the eyeglasses bridge.</p>
<p>Light reaching the eyes originates from a mixture of sources, which differ in intensity and spectral distribution. Common light intensities range over several orders of magnitude, e.g., from 500&#x02009;lx at an office desk to 100,000&#x02009;lx outside on a sunny day. When exposed to large amounts (above 1,000&#x02009;lx) of natural or artificial light, screens may contribute a negligible share. Consequently, screen use matters most when ambient light is dim, e.g., during evening or night hours. Given the high blue light energy of LED-backlit screens, spectral irradiance patterns could help to discriminate screen use from other light sources. Thus, a spectral decomposition of the incident light measurement is required at the sensor. The detection algorithm has to cope with noise added by head motion. In addition, a broad range of screens, content types, and ambient lighting conditions complicates the detection of screen use, as screen light emissions vary.</p>
<p>In this article, we introduce an approach to detect screen use with smart eyeglasses that provide a color light sensor embedded into the eyeglasses bridge. Electronics to store, process, and transmit measured light data were integrated into the eyeglasses frame. Our approach involves three steps to investigate the challenges related to screen use recognition: a test bench environment was used to investigate screen use detection in a fully controlled environment, ambient light intensity, and without head motion. In a lab study, screen use was investigated with participants wearing smart eyeglasses in different ambient light intensities. Finally, a study of typical activities of daily living (ADLs) evaluated detection of screen use in unconstrained daily life situations. By using features derived from the ratios between color channels, we detect when screen use occurs.</p>
<p>In particular, this article provides the following contributions:
<list list-type="order">
<list-item><p>We analyzed screens, content type, and typical viewing distance to derive device-specific light intensity at the user&#x02019;s eyes. We found that computer screens provide the highest light intensities due to their size, displayed content, and typical viewing distance.</p></list-item>
<list-item><p>We analyzed screen use detection performance under different ambient light sources and intensities in a test bench environment, which indicated perfect screen use detection.</p></list-item>
<list-item><p>We evaluated our screen use detection approach using smart eyeglasses worn by 14 participants in a lab study. For light intensities below 200&#x02009;lx, an ROC AUC of above 0.9 was reached using participant-independent models.</p></list-item>
<list-item><p>We applied our approach to data recorded in a study investigating different ADLs. In seven participants, we detected screen use with an average ROC AUC of 0.83 when assuming 30% screen use.</p></list-item>
</list></p>
</sec>
<sec id="S2">
<label>2</label> <title>Related Work</title>
<p>According to Duffy and Wright (<xref ref-type="bibr" rid="B11">2005</xref>), light is the dominant factor in entrainment of the circadian rhythm. Phase shifts of &#x02212;2 to &#x0002B;3&#x02009;h per day are possible depending on the intensity and timing of light exposure. The authors report that a misaligned circadian clock can lead to impaired performance, reduced alertness, and upset gastrointestinal function. Kantermann and Roenneberg (<xref ref-type="bibr" rid="B17">2009</xref>) found that light during night can damage DNA and thus lead to cancer, putting shift workers especially at risk. Brainard et al. (<xref ref-type="bibr" rid="B6">2001</xref>) found spectral sensitivity of the circadian system to peak at 464&#x02009;nm wavelength. Being exposed to blue light in the evening delays the production of melatonin and thus delays circadian phase. Screen use potentially causes circadian phase shift, e.g., when working late nights in front of a computer screen. As misaligned clocks negatively influence health, episodes of screen use could be detected automatically to support behavioral change.</p>
<p>The effect of screen use on the circadian rhythm has been analyzed in several studies. Cajochen et al. (<xref ref-type="bibr" rid="B7">2011</xref>) found a significant delay in dim light melatonin onset (DLMO) from evening exposure to LED-backlit computer screens compared to cold cathode fluorescent lamp (CCFL)-backlit computer screens. Prior to their regular bedtime, 13 participants performed a 5-h screen use episode during which melatonin was sampled every 30&#x02009;min. Wood et al. (<xref ref-type="bibr" rid="B31">2013</xref>) conducted an experiment on 12 participants using a tablet 2&#x02009;h prior to bedtime. They found that melatonin was suppressed by 23% on average after 2&#x02009;h of tablet use in the evening compared to no tablet use. After 1&#x02009;h of tablet use, the effect was smaller with only 7% melatonin suppression compared to no tablet use. Chang et al. (<xref ref-type="bibr" rid="B9">2015</xref>) investigated the influence of using light-emitting eReaders at night instead of reading print books. In their crossover protocol, 12 participants read for 4&#x02009;h prior to bedtime during five consecutive evenings per device. They found that DLMO was delayed by more than 1.5&#x02009;h when using the backlit eReader over the paper book. These findings confirm circadian phase shift effects through screen use in the evening. Thus, an automatic detection of screen use could inform users about their behavior and support them in implementing an effective compensation.</p>
<p>Studies investigating light influences on human physiology often used wrist-worn devices to record light exposure. Wahl et al. (<xref ref-type="bibr" rid="B30">2014</xref>) compared wrist- and head-worn light sensors and found substantial differences in measured light intensities. One core problem with wrist-worn light measurements was frequent occlusion of the sensor by long-sleeve clothing. Another issue is that light sensor sensitivity depends on angular displacement, typically 50% sensitivity at &#x000B1;60&#x000B0; displacement, which requires sensors to be worn in an orientation similar to the eyes. Others used head-worn devices such as the Daysimeter (Bierman et al., <xref ref-type="bibr" rid="B5">2005</xref>), which offer accurate recordings but were found impractical for continuous everyday use due to their form factor and clip-on attachment to eyeglasses. Regular eyeglasses are the most common vision aid. For example, in Germany, 63.5% of the population above 16&#x02009;years and 92% above 60&#x02009;years wear eyeglasses (Institut f&#x000FC;r Demoskopie Allensbach, <xref ref-type="bibr" rid="B15">2014</xref>). In our previous work, we embedded a multimodal, multipurpose sensing system into regular eyeglasses, termed WISEglass. WISEglass was used for a wide variety of sensing applications, ranging from daily activity recognition (Wahl et al., <xref ref-type="bibr" rid="B29">2015c</xref>) and dietary monitoring (Zhang et al., <xref ref-type="bibr" rid="B32">2016</xref>) to motion-based video game control (Wahl et al., <xref ref-type="bibr" rid="B27">2015a</xref>). The smart eyeglasses were validated in a study on 9 participants performing typical ADLs (Wahl et al., <xref ref-type="bibr" rid="B28">2015b</xref>,<xref ref-type="bibr" rid="B29">c</xref>). Nine activity clusters were detected from accelerometer and gyroscope data with 77% accuracy. Screen use detection using a color light sensor showed a mean accuracy of 80%. This article provides an in-depth evaluation of screen use detection with color light sensors in three subsequent studies. We investigate different screen types, ambient light sources, and content types.</p>
<p>Intervention measures can be applied to minimize the influence of screen use on the circadian phase. Heath et al. (<xref ref-type="bibr" rid="B14">2014</xref>) studied whether tablet use 1&#x02009;h prior to bedtime influences the circadian phase. In their study on 16 participants, unfiltered light was compared to light filtered with the f.lux application. They found no conclusive evidence that 1&#x02009;h of tablet screen use had a significant impact on the circadian phase. van der Lely et al. (<xref ref-type="bibr" rid="B26">2015</xref>) investigated the effect of blue light-blocking glasses on the circadian phase. Thirteen young males spent the 3&#x02009;h before bedtime in front of a computer screen for 1&#x02009;week per condition; melatonin increased significantly when blue light-blocking glasses were worn. Thus, suitable software- and hardware-based intervention measures exist to minimize the effect of screen use on the circadian phase. The work presented in this article could support users by reminding them to wear blue light-blocking glasses or by (automatically) enabling a software-based adjustment of the screen&#x02019;s light profile and intensity.</p>
</sec>
<sec id="S3">
<label>3</label> <title>Light Monitoring in Eyeglasses</title>
<p>The WISEglass project aims to embed a multimodal sensor system in regular eyeglasses that could be worn as an everyday accessory, just like eyeglasses are worn today. Our first prototype was based on regular eyeglasses and is shown in Figure <xref ref-type="fig" rid="F1">1</xref>A.</p>
<fig position="float" id="F1">
<label>Figure 1</label>
<caption><p><bold>Eyeglasses prototypes used in this work</bold>. The eyeglasses prototypes integrate a color light sensor with red, green, blue, and clear channels into the eyeglasses bridge. In the prototype shown in panel <bold>(B)</bold>, electronics for recording, transmitting, and storing sensor data were integrated into the eyeglasses frame. <bold>(A)</bold> First prototype based on regular spectacles. <bold>(B)</bold> 3D printed prototype.</p></caption>
<graphic xlink:href="fict-04-00008-g001.tif"/>
</fig>
<p>Subsequently, a 3D printed version was built as shown in Figure <xref ref-type="fig" rid="F1">1</xref>B. Both prototypes were used for the analyses presented in this article. We added compartments inside the temple ends of the eyeglasses to house the baseboard and battery. Compared to our first prototype, the compartments improve balancing of the component weight. The compartments disappear behind a wearer&#x02019;s ears when attached; thus, WISEglass appears as regular eyeglasses to bystanders. We included a small compartment on the bridge of the eyeglasses to embed a color light sensor. The bridge location can be used to unobtrusively measure light. The 3D printed eyeglasses could be fitted to the wearer&#x02019;s head to achieve wearing comfort similar to existing eyeglasses models.</p>
<p>Integrated sensors are controlled and sampled by the on-board microcontroller. Data can be stored in flash memory for later download or streamed directly via Bluetooth Smart. The battery lasts for 32&#x02009;h when sampling light sensor data at 50&#x02009;Hz. Sampling rates can be configured depending on the application.</p>
</sec>
<sec id="S4" sec-type="methods">
<label>4</label> <title>Evaluation Methodology</title>
<p>In this section, we describe goals, sensing approach, recording protocol, and evaluation methodology for each of the three analysis studies.</p>
<p>Starting from the illuminance at the screen during PC work and TV use, we calculated the light intensity at the user&#x02019;s eyes for different devices as described in Section <xref ref-type="sec" rid="S4-1">4.1</xref>. A test bench was used to derive a baseline of the irradiance provided by screens at regular viewing distance and under controlled environmental lighting conditions. The test bench analysis is further described in Section <xref ref-type="sec" rid="S4-2">4.2</xref>. In a lab study (described in Section <xref ref-type="sec" rid="S4-3">4.3</xref>), data were collected using the same desk setup and viewing distances for every participant while ambient light intensity varied due to weather and time of the day. The ADL study (Section <xref ref-type="sec" rid="S4-4">4.4</xref>) investigated the natural variability of screen use, where activities were suggested but their execution was left to the participants.</p>
<p>In this work, we obtained data from a TCS34725 color light sensor (ams AG, <xref ref-type="bibr" rid="B2">2013</xref>). Light was measured in four spectral channels: red, green, blue, and clear. After downloading the raw data, virtual light channels were computed from multiple raw color channels by deriving channel ratios. As an example, the ratio of the blue to clear light channel expresses how much blue light is measured in relation to the total amount of light.</p>
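The virtual channel computation can be sketched as follows. This is a minimal illustration in Python with our own variable names and channel-ratio selection, not the authors' processing code; per the abstract, such ratio features later feed a linear support vector machine.

```python
import numpy as np

def ratio_features(r, g, b, c):
    """Derive virtual light channels as ratios between raw color channels.

    r, g, b, c: sequences of raw red, green, blue, and clear readings
    from the TCS34725. The blue/clear ratio, e.g., expresses how much
    blue light is measured relative to the total amount of light.
    """
    eps = 1e-9  # guard against division by zero in complete darkness
    r, g, b, c = (np.asarray(x, dtype=float) for x in (r, g, b, c))
    return np.column_stack([
        r / (c + eps),  # red share of total light
        g / (c + eps),  # green share
        b / (c + eps),  # blue share
        b / (r + eps),  # blue relative to red
        b / (g + eps),  # blue relative to green
    ])

# A blue-heavy reading, as expected in front of an LED-backlit screen:
X = ratio_features([120], [150], [210], [400])
```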
<sec id="S4-1">
<label>4.1</label> <title>Light Intensity Calculations</title>
<p>The light intensity at the user&#x02019;s eyes is necessary to quantify the impact of screen use on the circadian phase. We calculated the light intensity at the user&#x02019;s eyes for different device types during PC and TV use at typical viewing distances. The area <italic>A</italic> was computed from the aspect ratio <italic>w</italic>:<italic>h</italic> and the diagonal length <italic>l</italic> of the screen as
<disp-formula id="E1"><label>(1)</label><mml:math id="M1"><mml:mi>A</mml:mi><mml:mo class="MathClass-rel">&#x0003D;</mml:mo><mml:mfrac><mml:mrow><mml:mi>w</mml:mi><mml:mo class="MathClass-bin">&#x022C5;</mml:mo><mml:mi>h</mml:mi><mml:mo class="MathClass-bin">&#x022C5;</mml:mo><mml:msup><mml:mrow><mml:mi>l</mml:mi></mml:mrow><mml:mrow><mml:mn>2</mml:mn></mml:mrow></mml:msup></mml:mrow><mml:mrow><mml:msup><mml:mrow><mml:mi>w</mml:mi></mml:mrow><mml:mrow><mml:mn>2</mml:mn></mml:mrow></mml:msup><mml:mo class="MathClass-bin">&#x0002B;</mml:mo><mml:msup><mml:mrow><mml:mi>h</mml:mi></mml:mrow><mml:mrow><mml:mn>2</mml:mn></mml:mrow></mml:msup></mml:mrow></mml:mfrac><mml:mo class="MathClass-punc">.</mml:mo></mml:math></disp-formula></p>
<p>The luminance <italic>L</italic> for each screen was researched online and used to compute luminous intensity <italic>I</italic> as
<disp-formula id="E2"><label>(2)</label><mml:math id="M2"><mml:mi>I</mml:mi><mml:mo class="MathClass-rel">&#x0003D;</mml:mo><mml:mi>L</mml:mi><mml:mo class="MathClass-bin">&#x022C5;</mml:mo><mml:mi>A</mml:mi><mml:mo class="MathClass-punc">.</mml:mo></mml:math></disp-formula></p>
<p>Different screen activities yield different light intensity levels at the user&#x02019;s eyes. During high-illuminance screen activities, e.g., editing documents at a PC screen, a majority of pixels is typically bright. In contrast, low-illuminance screen activities, e.g., watching a movie, feature dark backgrounds. To reflect screen activity-dependent light intensity differences, we used two activity prototypes to describe typical screen activities: PC work for high-illuminance screen activities and watching TV for low-illuminance screen activities. To incorporate the effect of content into the textbook irradiance computations, we added the content factor <italic>&#x003BC;<sub>PC</sub></italic> and <italic>&#x003BC;<sub>TV</sub></italic> in equations (<xref ref-type="disp-formula" rid="E3">3</xref>) and (<xref ref-type="disp-formula" rid="E4">4</xref>), respectively. Our test bench measurements found TV content to produce one-third of the light intensity at the user&#x02019;s eyes compared to PC content. Content factors were therefore set to <italic>&#x003BC;<sub>PC</sub></italic>&#x02009;&#x0003D;&#x02009;1 and <inline-formula><mml:math id="M3"><mml:msub><mml:mrow><mml:mi>&#x003BC;</mml:mi></mml:mrow><mml:mrow><mml:mi mathvariant="italic">TV</mml:mi></mml:mrow></mml:msub><mml:mo class="MathClass-rel">&#x0003D;</mml:mo><mml:mfrac><mml:mrow><mml:mn>1</mml:mn></mml:mrow><mml:mrow><mml:mn>3</mml:mn></mml:mrow></mml:mfrac></mml:math></inline-formula>. Finally, we used a typical viewing distance <italic>d</italic> for each device to compute light intensity at the user&#x02019;s eyes during PC use <italic>E<sub>PC</sub></italic> and TV use <italic>E<sub>TV</sub></italic> as
<disp-formula id="E3"><label>(3)</label><mml:math id="M4"><mml:msub><mml:mrow><mml:mi>E</mml:mi></mml:mrow><mml:mrow><mml:mi mathvariant="italic">PC</mml:mi></mml:mrow></mml:msub><mml:mo class="MathClass-rel">&#x0003D;</mml:mo><mml:mfrac><mml:mrow><mml:mi>I</mml:mi></mml:mrow><mml:mrow><mml:msup><mml:mrow><mml:mi>d</mml:mi></mml:mrow><mml:mrow><mml:mn>2</mml:mn></mml:mrow></mml:msup></mml:mrow></mml:mfrac><mml:mo class="MathClass-bin">&#x022C5;</mml:mo><mml:msub><mml:mrow><mml:mi>&#x003BC;</mml:mi></mml:mrow><mml:mrow><mml:mi mathvariant="italic">PC</mml:mi></mml:mrow></mml:msub><mml:mo class="MathClass-punc">,</mml:mo></mml:math></disp-formula>
<disp-formula id="E4"><label>(4)</label><mml:math id="M5"><mml:msub><mml:mrow><mml:mi>E</mml:mi></mml:mrow><mml:mrow><mml:mi mathvariant="italic">TV</mml:mi></mml:mrow></mml:msub><mml:mo class="MathClass-rel">&#x0003D;</mml:mo><mml:mfrac><mml:mrow><mml:mi>I</mml:mi></mml:mrow><mml:mrow><mml:msup><mml:mrow><mml:mi>d</mml:mi></mml:mrow><mml:mrow><mml:mn>2</mml:mn></mml:mrow></mml:msup></mml:mrow></mml:mfrac><mml:mo class="MathClass-bin">&#x022C5;</mml:mo><mml:msub><mml:mrow><mml:mi>&#x003BC;</mml:mi></mml:mrow><mml:mrow><mml:mi mathvariant="italic">TV</mml:mi></mml:mrow></mml:msub><mml:mo class="MathClass-punc">.</mml:mo></mml:math></disp-formula></p>
<p>For the light intensity analysis, the distance <italic>d</italic> between screen and eyes is essential. However, viewing distances differ per device type. We used the following viewing distance settings in this analysis. Smartphone users maintain an average viewing distance of 32.2&#x02009;cm when web browsing (Bababekova et al., <xref ref-type="bibr" rid="B3">2011</xref>). Tablets and eBook readers are typically used at 50&#x02009;cm distance (Shieh and Lee, <xref ref-type="bibr" rid="B23">2007</xref>; Campbell, <xref ref-type="bibr" rid="B8">2013</xref>). For PC use, the German Social Accident Insurance (DGUV) recommends a viewing distance between 50 and 65&#x02009;cm (Deutsche Gesetzliche Unfallversicherung e.V. (DGUV), <xref ref-type="bibr" rid="B10">2015</xref>). In this investigation, we assumed 50&#x02009;cm for notebooks and 60&#x02009;cm for desktop PC use. Investigations on TV use found the average viewing distance to be above 200&#x02009;cm (Nathan et al., <xref ref-type="bibr" rid="B19">1985</xref>; Lee, <xref ref-type="bibr" rid="B18">2012</xref>).</p>
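Equations (1)&#x02013;(4) combine into a short calculation, sketched below in Python. The device parameters used here (luminance, diagonal, viewing distance) are illustrative assumptions, not the values researched for the study.

```python
# Sketch of the light-intensity calculation in equations (1)-(4);
# the device parameters below are hypothetical examples.

def screen_area(w, h, l):
    """Equation (1): screen area A from aspect ratio w:h and diagonal l (m)."""
    return (w * h * l ** 2) / (w ** 2 + h ** 2)

def eye_illuminance(L, A, d, mu):
    """Equations (2)-(4): light intensity at the user's eyes.

    L: screen luminance (cd/m^2), A: screen area (m^2),
    d: viewing distance (m), mu: content factor (1 for PC work, 1/3 for TV).
    """
    I = L * A               # equation (2): luminous intensity
    return I / d ** 2 * mu  # equations (3) and (4)

# Hypothetical 24-inch (0.61 m diagonal) 16:9 screen at 250 cd/m^2,
# viewed at the assumed desktop distance of 60 cm:
A = screen_area(16, 9, 0.61)
E_pc = eye_illuminance(250, A, 0.60, 1.0)
E_tv = eye_illuminance(250, A, 0.60, 1 / 3)
```

By construction, <italic>E<sub>TV</sub></italic> is one-third of <italic>E<sub>PC</sub></italic> for the same screen and distance, reflecting the content factors.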
</sec>
<sec id="S4-2">
<label>4.2</label> <title>Test Bench</title>
<p>We built a test bench to analyze light irradiance from screens under controlled ambient lighting conditions. The test bench included four different types of dimmable ambient light sources. Movie or PC use content was displayed on a computer screen to simulate different content types. Light irradiance was measured at a distance of 50&#x02009;cm, within the recommended workplace screen distance of 50 to 65&#x02009;cm (Deutsche Gesetzliche Unfallversicherung e.V. (DGUV), <xref ref-type="bibr" rid="B10">2015</xref>).</p>
<sec id="S4-2-1">
<label>4.2.1</label> <title>Sensing Approach</title>
<p>The test bench was constructed out of wooden boards and aluminum profiles forming a cuboid of 200&#x02009;cm height with a footprint of 100&#x02009;cm&#x02009;&#x000D7;&#x02009;100&#x02009;cm. The floor panel was mounted at a height of 40&#x02009;cm from the ground leaving a distance from floor to ceiling of 160&#x02009;cm. The height resembles the typical distance between table surface and ceiling lamps in office spaces. The screen was mounted at the rear end of the floor panel. Irradiance was measured at 50&#x02009;cm distance, resembling the typical viewing distance of computer screens. A model of the test bench is shown in Figure <xref ref-type="fig" rid="F2">2</xref>A.</p>
<fig position="float" id="F2">
<label>Figure 2</label>
<caption><p><bold>Test bench analysis</bold>. Different ambient light sources were mounted on the ceiling of the construction. On the floor panel, screen use was simulated on a 24&#x02033; IPS LED-backlit screen. Four color light sensors were placed at 50&#x02009;cm distance from the screen. Light irradiance reference measurements were recorded using a spectral irradiance meter. <bold>(A)</bold> 3D model of the test bench. <bold>(B)</bold> Schematic of device arrangement in the horizontal plane at the floor panel.</p></caption>
<graphic xlink:href="fict-04-00008-g002.tif"/>
</fig>
<p>Initially, we analyzed the spectral irradiance of a typical LED-backlit screen. Here, we used the SpectraRad Xpress BSR112E-VIS miniature spectral irradiance meter by BWTEK. The recorded spectrum ranges from 380 to 750&#x02009;nm wavelength with a resolution of 3&#x02009;nm. Connected to a PC running the BWSpec software, samples were recorded at 26.3&#x02009;Hz. Spectral measurements were averaged per second resulting in a sampling rate of 1&#x02009;Hz. Dark calibration was conducted prior to recordings.</p>
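The per-second averaging of the spectrometer samples can be sketched as follows. This is our own illustration with assumed array shapes; the BWSpec recording software itself is not reproduced here.

```python
import numpy as np

def average_per_second(t, spectra):
    """Average spectral samples into 1-s bins (sketch with assumed shapes).

    t: sample timestamps in seconds (~26.3 Hz), shape (n,)
    spectra: one spectrum per sample, shape (n, n_wavelengths)
    Returns one averaged spectrum per whole second, i.e., a 1 Hz series.
    """
    t = np.asarray(t)
    spectra = np.asarray(spectra, dtype=float)
    seconds = np.floor(t).astype(int)   # assign each sample to its second
    return np.array([spectra[seconds == s].mean(axis=0)
                     for s in np.unique(seconds)])

# Five samples spanning two seconds collapse to two averaged spectra:
averaged = average_per_second([0.0, 0.5, 0.9, 1.1, 1.6],
                              [[1.0], [2.0], [3.0], [10.0], [20.0]])
```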
<p>For test bench recordings, an HP EliteDisplay E241i 24&#x02033; IPS panel LED-backlit screen was used. The screen configuration was reset to factory default prior to recordings. Figure <xref ref-type="fig" rid="F3">3</xref> depicts the spectrum of the screen with all pixels set to white. The three spectral peaks reflect the composition of the three primary colors: red, green, and blue. The intense blue peak is characteristic of LED-backlit screens (Cajochen et al., <xref ref-type="bibr" rid="B7">2011</xref>).</p>
<fig position="float" id="F3">
<label>Figure 3</label>
<caption><p><bold>Test bench analysis</bold>. Spectrum of an HP EliteDisplay E241i 24&#x02033; LED-backlit IPS panel displaying a full white image. The spectrum was computed by averaging 26 samples from the spectral irradiance meter. It can be seen that the blue spectral component is more prominent than the green and red components.</p></caption>
<graphic xlink:href="fict-04-00008-g003.tif"/>
</fig>
<p>Subsequently, we investigated the screen irradiance measured by a typical commercial color light sensor under different lighting conditions. Data from four TCS34725 (ams AG, <xref ref-type="bibr" rid="B2">2013</xref>) color light sensors, the same as used in WISEglass, were recorded. We used four sensors to account for intersensor variation. Each light sensor was connected to an Arduino via I<sup>2</sup>C bus and triggered an interrupt when a new measurement was ready. Sensor integration time was set to 50&#x02009;ms, resulting in a sampling rate of 20&#x02009;Hz. Arduinos were connected to a computer running the CRNT software to record data (Bannach et al., <xref ref-type="bibr" rid="B4">2008</xref>). Sensor gain was set to 60&#x000D7;, resulting in 27.6% saturation of the clear light channel at 1,000&#x02009;lx, a typical indoor light intensity on a sunny day. Figure <xref ref-type="fig" rid="F2">2</xref>B shows the installation on the floor panel of the test bench.</p>
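The reported gain setting implies a rough saturation limit for the clear channel. The following back-of-the-envelope estimate is our own, not from the study, and assumes an approximately linear sensor response:

```python
# With 60x gain and 50 ms integration, the clear channel reaches 27.6%
# of full scale at 1,000 lx, so it saturates near 1000 / 0.276 lx
# (assuming approximately linear sensor response over this range).
saturation_fraction = 0.276   # clear-channel fill at 1,000 lx (from the text)
reference_lux = 1000
clipping_lux = reference_lux / saturation_fraction
print(round(clipping_lux))    # → 3623, well below bright-daylight intensities
```

This suggests the chosen gain leaves headroom for typical indoor conditions but would clip outdoors, consistent with the focus on dim-light screen use.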
<p>Three lamp sockets were mounted at the ceiling of the test bench for LED lamps, halogen lamps, and fluorescent lamps. We evaluated screen use detection using four different artificial light sources described in Table <xref ref-type="table" rid="T1">1</xref>. The spectra of the lamps at full brightness are shown in Figure <xref ref-type="fig" rid="F4">4</xref>.</p>
<table-wrap position="float" id="T1">
<label>Table 1</label>
<caption><p><bold>Test bench analysis</bold>.</p></caption>
<table frame="hsides" rules="groups">
<thead>
<tr>
<th align="left">No.</th>
<th align="left">Lamp type</th>
<th align="center">Color temperature (K)</th>
<th align="center">Energy consumption (W)</th>
</tr>
</thead>
<tbody>
<tr>
<td align="left">1</td>
<td align="left">Halogen</td>
<td align="center">3,000</td>
<td align="center">35</td>
</tr>
<tr>
<td align="left">2</td>
<td align="left">LED</td>
<td align="center">2,700</td>
<td align="center">5.9</td>
</tr>
<tr>
<td align="left">3</td>
<td align="left">Cold CCFL</td>
<td align="center">8,000</td>
<td align="center">18</td>
</tr>
<tr>
<td align="left">4</td>
<td align="left">Warm CCFL</td>
<td align="center">2,700</td>
<td align="center">18</td>
</tr>
</tbody>
</table>
<table-wrap-foot>
<p><italic>Artificial light sources and their core properties according to manufacturer specifications</italic>.</p>
</table-wrap-foot>
</table-wrap>
<fig position="float" id="F4">
<label>Figure 4</label>
<caption><p><bold>Test bench analysis</bold>. Spectra of the four different ambient light sources used. Spectral distribution differs per lamp type. CCFL lamps show few, relatively high-energy peaks. <bold>(A)</bold> Halogen (1), <bold>(B)</bold> LED (2), <bold>(C)</bold> cold CCFL (3), and <bold>(D)</bold> warm CCFL (4).</p></caption>
<graphic xlink:href="fict-04-00008-g004.tif"/>
</fig>
</sec>
<sec id="S4-2-2">
<label>4.2.2</label> <title>Recording Protocol</title>
<p>The test bench recording protocol contained three independent variables: screen state <italic>S</italic>, lamp type <italic>T</italic>, and lamp intensity <italic>D</italic>.</p>
<p>Screen state was set to one of the three following states: (1) off for no screen use <italic>S</italic>&#x02009;&#x0003D;&#x02009;<italic>off</italic>, (2) displaying TV content <italic>S</italic>&#x02009;&#x0003D;&#x02009;<italic>tv</italic>, or (3) displaying PC use content <italic>S</italic>&#x02009;&#x0003D;&#x02009;<italic>pc</italic>. To switch between <italic>S</italic>&#x02009;&#x0003D;&#x02009;<italic>off</italic> and <italic>S</italic>&#x02009;&#x0003D;&#x02009;<italic>tv, pc</italic>, an <italic>xrandr</italic> script was used. The James Bond Spectre movie trailer was used to simulate TV content.<xref ref-type="fn" rid="fn1"><sup>1</sup></xref> To simulate PC content, a video tutorial on iPython notebooks was played.<xref ref-type="fn" rid="fn2"><sup>2</sup></xref> Videos were played in an endless loop.</p>
<p>The lamp type was either one of the four different lamps listed in Table <xref ref-type="table" rid="T1">1</xref> or darkness: <italic>T</italic>&#x02009;&#x0003D;&#x02009;{<italic>Halogen</italic>, <italic>LED</italic>, <italic>Cold CCFL</italic>, <italic>Warm CCFL</italic>, <italic>dark</italic>}. Lamp intensity was set to <italic>D</italic>&#x02009;&#x0003D;&#x02009;{<italic>100%</italic> &#x02026;<italic>10%</italic>} in steps of 10%. We recorded 10&#x02009;min of data for a total of <italic>S</italic>&#x02009;&#x000D7;&#x02009;<italic>T</italic>&#x02009;&#x000D7;&#x02009;<italic>D</italic>&#x02009;&#x0003D;&#x02009;141 states (<italic>T</italic>&#x02009;&#x0003D;&#x02009;<italic>dark</italic> has no steps for <italic>D</italic>).</p>
</sec>
<sec id="S4-2-3">
<label>4.2.3</label> <title>Evaluation Procedure</title>
<p>Starting from light measurements, we computed ratios between multiple color light sensor channels to become independent from absolute values. Each ratio represents one virtual light channel. Virtual light channels put each color channel in relation to others. For example, the virtual channel <inline-formula><mml:math id="M6"><mml:mfrac><mml:mrow><mml:mi mathvariant="italic">blue</mml:mi></mml:mrow><mml:mrow><mml:mi mathvariant="italic">clear</mml:mi></mml:mrow></mml:mfrac></mml:math></inline-formula> indicates the ratio of blue light vs. overall light intensity. We computed the virtual light channels <inline-formula><mml:math id="M7"><mml:mfrac><mml:mrow><mml:mrow><mml:mo class="MathClass-open">&#x0007B;</mml:mo><mml:mrow><mml:mi>R</mml:mi><mml:mo class="MathClass-punc">,</mml:mo><mml:mspace width="2.56804pt" class="tmspace"/><mml:mi>G</mml:mi><mml:mo class="MathClass-punc">,</mml:mo><mml:mspace width="2.56804pt" class="tmspace"/><mml:mi>B</mml:mi></mml:mrow><mml:mo class="MathClass-close">&#x0007D;</mml:mo></mml:mrow></mml:mrow><mml:mrow><mml:mi>C</mml:mi></mml:mrow></mml:mfrac></mml:math></inline-formula>, <inline-formula><mml:math id="M8"><mml:mfrac><mml:mrow><mml:mi>R</mml:mi></mml:mrow><mml:mrow><mml:mi>G</mml:mi></mml:mrow></mml:mfrac></mml:math></inline-formula>, <inline-formula><mml:math id="M9"><mml:mfrac><mml:mrow><mml:mi>B</mml:mi></mml:mrow><mml:mrow><mml:mi>R</mml:mi></mml:mrow></mml:mfrac></mml:math></inline-formula>, <inline-formula><mml:math id="M10"><mml:mfrac><mml:mrow><mml:mi>B</mml:mi></mml:mrow><mml:mrow><mml:mi>G</mml:mi></mml:mrow></mml:mfrac></mml:math></inline-formula>, <inline-formula><mml:math id="M11"><mml:mfrac><mml:mrow><mml:mi>C</mml:mi><mml:mo class="MathClass-bin">&#x02212;</mml:mo><mml:mrow><mml:mo class="MathClass-open">&#x0007B;</mml:mo><mml:mrow><mml:mi>R</mml:mi><mml:mo class="MathClass-punc">,</mml:mo><mml:mspace width="2.56804pt" class="tmspace"/><mml:mi>G</mml:mi><mml:mo class="MathClass-punc">,</mml:mo><mml:mspace 
width="2.56804pt" class="tmspace"/><mml:mi>B</mml:mi></mml:mrow><mml:mo class="MathClass-close">&#x0007D;</mml:mo></mml:mrow></mml:mrow><mml:mrow><mml:mrow><mml:mo class="MathClass-open">&#x0007B;</mml:mo><mml:mrow><mml:mi>R</mml:mi><mml:mo class="MathClass-punc">,</mml:mo><mml:mspace width="2.56804pt" class="tmspace"/><mml:mi>G</mml:mi><mml:mo class="MathClass-punc">,</mml:mo><mml:mspace width="2.56804pt" class="tmspace"/><mml:mi>B</mml:mi><mml:mo class="MathClass-punc">,</mml:mo><mml:mspace width="2.56804pt" class="tmspace"/><mml:mi>C</mml:mi></mml:mrow><mml:mo class="MathClass-close">&#x0007D;</mml:mo></mml:mrow></mml:mrow></mml:mfrac></mml:math></inline-formula>, <inline-formula><mml:math id="M12"><mml:mfrac><mml:mrow><mml:mi mathvariant="italic">log</mml:mi><mml:mrow><mml:mo class="MathClass-open">(</mml:mo><mml:mrow><mml:mi>C</mml:mi><mml:mo class="MathClass-bin">&#x02212;</mml:mo><mml:mrow><mml:mo class="MathClass-open">&#x0007B;</mml:mo><mml:mrow><mml:mi>R</mml:mi><mml:mo class="MathClass-punc">,</mml:mo><mml:mspace width="2.56804pt" class="tmspace"/><mml:mi>G</mml:mi><mml:mo class="MathClass-punc">,</mml:mo><mml:mspace width="2.56804pt" class="tmspace"/><mml:mi>B</mml:mi></mml:mrow><mml:mo class="MathClass-close">&#x0007D;</mml:mo></mml:mrow></mml:mrow><mml:mo class="MathClass-close">)</mml:mo></mml:mrow></mml:mrow><mml:mrow><mml:mi mathvariant="italic">log</mml:mi><mml:mrow><mml:mo class="MathClass-open">(</mml:mo><mml:mrow><mml:mrow><mml:mo class="MathClass-open">&#x0007B;</mml:mo><mml:mrow><mml:mi>R</mml:mi><mml:mo class="MathClass-punc">,</mml:mo><mml:mspace width="2.56804pt" class="tmspace"/><mml:mi>G</mml:mi><mml:mo class="MathClass-punc">,</mml:mo><mml:mspace width="2.56804pt" class="tmspace"/><mml:mi>B</mml:mi><mml:mo class="MathClass-punc">,</mml:mo><mml:mspace width="2.56804pt" class="tmspace"/><mml:mi>C</mml:mi></mml:mrow><mml:mo class="MathClass-close">&#x0007D;</mml:mo></mml:mrow></mml:mrow><mml:mo 
class="MathClass-close">)</mml:mo></mml:mrow></mml:mrow></mml:mfrac></mml:math></inline-formula>, <inline-formula><mml:math id="M13"><mml:mfrac><mml:mrow><mml:mi mathvariant="italic">log</mml:mi><mml:mrow><mml:mo class="MathClass-open">(</mml:mo><mml:mrow><mml:mrow><mml:mo class="MathClass-open">&#x0007B;</mml:mo><mml:mrow><mml:mi>R</mml:mi><mml:mo class="MathClass-punc">,</mml:mo><mml:mspace width="2.56804pt" class="tmspace"/><mml:mi>G</mml:mi><mml:mo class="MathClass-punc">,</mml:mo><mml:mspace width="2.56804pt" class="tmspace"/><mml:mi>B</mml:mi></mml:mrow><mml:mo class="MathClass-close">&#x0007D;</mml:mo></mml:mrow></mml:mrow><mml:mo class="MathClass-close">)</mml:mo></mml:mrow></mml:mrow><mml:mrow><mml:mi mathvariant="italic">log</mml:mi><mml:mrow><mml:mo class="MathClass-open">(</mml:mo><mml:mrow><mml:mi>C</mml:mi></mml:mrow><mml:mo class="MathClass-close">)</mml:mo></mml:mrow></mml:mrow></mml:mfrac></mml:math></inline-formula>, <inline-formula><mml:math id="M14"><mml:mfrac><mml:mrow><mml:mi mathvariant="italic">log</mml:mi><mml:mrow><mml:mo class="MathClass-open">(</mml:mo><mml:mrow><mml:mi>B</mml:mi></mml:mrow><mml:mo class="MathClass-close">)</mml:mo></mml:mrow></mml:mrow><mml:mrow><mml:mi mathvariant="italic">log</mml:mi><mml:mrow><mml:mo class="MathClass-open">(</mml:mo><mml:mrow><mml:mi>R</mml:mi></mml:mrow><mml:mo class="MathClass-close">)</mml:mo></mml:mrow></mml:mrow></mml:mfrac></mml:math></inline-formula>, <inline-formula><mml:math id="M15"><mml:mfrac><mml:mrow><mml:mi mathvariant="italic">log</mml:mi><mml:mrow><mml:mo class="MathClass-open">(</mml:mo><mml:mrow><mml:mi>B</mml:mi></mml:mrow><mml:mo class="MathClass-close">)</mml:mo></mml:mrow></mml:mrow><mml:mrow><mml:mi mathvariant="italic">log</mml:mi><mml:mrow><mml:mo class="MathClass-open">(</mml:mo><mml:mrow><mml:mi>G</mml:mi></mml:mrow><mml:mo class="MathClass-close">)</mml:mo></mml:mrow></mml:mrow></mml:mfrac></mml:math></inline-formula>, <inline-formula><mml:math 
id="M16"><mml:mfrac><mml:mrow><mml:mi mathvariant="italic">log</mml:mi><mml:mrow><mml:mo class="MathClass-open">(</mml:mo><mml:mrow><mml:mi>R</mml:mi></mml:mrow><mml:mo class="MathClass-close">)</mml:mo></mml:mrow></mml:mrow><mml:mrow><mml:mi mathvariant="italic">log</mml:mi><mml:mrow><mml:mo class="MathClass-open">(</mml:mo><mml:mrow><mml:mi>G</mml:mi></mml:mrow><mml:mo class="MathClass-close">)</mml:mo></mml:mrow></mml:mrow></mml:mfrac></mml:math></inline-formula>, <inline-formula><mml:math id="M17"><mml:mfrac><mml:mrow><mml:msqrt><mml:mrow><mml:mrow><mml:mo class="MathClass-open">&#x0007B;</mml:mo><mml:mrow><mml:mi>R</mml:mi><mml:mo class="MathClass-punc">,</mml:mo><mml:mspace width="2.56804pt" class="tmspace"/><mml:mi>G</mml:mi><mml:mo class="MathClass-punc">,</mml:mo><mml:mspace width="2.56804pt" class="tmspace"/><mml:mi>B</mml:mi></mml:mrow><mml:mo class="MathClass-close">&#x0007D;</mml:mo></mml:mrow></mml:mrow></mml:msqrt></mml:mrow><mml:mrow><mml:msqrt><mml:mrow><mml:mi>C</mml:mi></mml:mrow></mml:msqrt></mml:mrow></mml:mfrac></mml:math></inline-formula>, <inline-formula><mml:math id="M18"><mml:mfrac><mml:mrow><mml:msqrt><mml:mrow><mml:mi>R</mml:mi></mml:mrow></mml:msqrt></mml:mrow><mml:mrow><mml:msqrt><mml:mrow><mml:mi>G</mml:mi></mml:mrow></mml:msqrt></mml:mrow></mml:mfrac></mml:math></inline-formula>, <inline-formula><mml:math id="M19"><mml:mfrac><mml:mrow><mml:msqrt><mml:mrow><mml:mi>B</mml:mi></mml:mrow></mml:msqrt></mml:mrow><mml:mrow><mml:msqrt><mml:mrow><mml:mi>R</mml:mi></mml:mrow></mml:msqrt></mml:mrow></mml:mfrac></mml:math></inline-formula>, and <inline-formula><mml:math id="M20"><mml:mfrac><mml:mrow><mml:msqrt><mml:mrow><mml:mi>B</mml:mi></mml:mrow></mml:msqrt></mml:mrow><mml:mrow><mml:msqrt><mml:mrow><mml:mi>G</mml:mi></mml:mrow></mml:msqrt></mml:mrow></mml:mfrac></mml:math></inline-formula>, where R, G, B, and C represent the red, green, blue, and clear channel of the sensor, respectively.</p>
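The ratio construction above can be made concrete with a minimal Python sketch. It implements a representative subset of the virtual light channels; channel names, the `eps` guard against division by zero, and the function name are illustrative, not from the paper.

```python
import numpy as np

def virtual_channels(r, g, b, c, eps=1e-9):
    """Compute a representative subset of the ratio-based virtual light
    channels from raw red, green, blue, and clear sensor counts.
    eps guards against division by zero and log of zero."""
    r, g, b, c = (np.asarray(x, dtype=float) + eps for x in (r, g, b, c))
    return {
        "R/C": r / c, "G/C": g / c, "B/C": b / c,   # color vs. overall light
        "R/G": r / g, "B/R": b / r, "B/G": b / g,   # pairwise color ratios
        "(C-B)/C": (c - b) / c,                     # non-blue fraction
        "logB/logR": np.log(b) / np.log(r),         # log-domain ratio
        "sqrtB/sqrtC": np.sqrt(b) / np.sqrt(c),     # sqrt-domain ratio
    }
```

Because every channel is a ratio, a uniform change in overall light intensity cancels out, which is what makes the representation independent of absolute values.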
<p>Virtual light channel data were segmented using a sliding window. Different window sizes between 3 and 120&#x02009;s were investigated while maintaining a fixed overlap of 50% of the window size. For each virtual light channel, 18 time domain features were calculated as listed in Figure <xref ref-type="fig" rid="F5">5</xref>, resulting in a total of 756 features per window. Features were standardized by subtracting the mean and dividing by the SD.</p>
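The segmentation and feature step can be sketched as follows. This is a minimal illustration, not the full pipeline: only a few of the 18 time domain features are shown, and the function names are hypothetical.

```python
import numpy as np

def sliding_windows(x, win, overlap=0.5):
    """Yield successive windows of `win` samples with the given
    fractional overlap (50% of the window size in the text)."""
    step = max(1, int(win * (1 - overlap)))
    for start in range(0, len(x) - win + 1, step):
        yield x[start:start + win]

def window_features(w):
    """A subset of time domain features per virtual channel, including
    the 5/10/25/75/90/95% percentiles mentioned in Figure 5."""
    w = np.asarray(w, dtype=float)
    feats = [w.mean(), w.std(), w.min(), w.max(), np.median(w)]
    feats += list(np.percentile(w, [5, 10, 25, 75, 90, 95]))
    return np.array(feats)

# Demo: 4-sample windows with 50% overlap over a 10-sample channel.
windows = list(sliding_windows(list(range(10)), win=4))
feats = window_features(windows[0])
```

Concatenating such feature vectors over all virtual channels yields the per-window feature vector that is then standardized and classified.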
<fig position="float" id="F5">
<label>Figure 5</label>
<caption><p><bold>Data processing pipeline for light sensor data from raw color channel data to classifier output</bold>. Virtual light channels were computed from color light sensor data. After sliding window segmentation, features were extracted. In addition to features listed, the 5, 10, 25, 75, 90, and 95% percentiles were calculated. Subsequently, a linear support vector machine was used to classify screen use episodes.</p></caption>
<graphic xlink:href="fict-04-00008-g005.tif"/>
</fig>
<p>A linear support vector machine (LSVM) was used to classify between screen use and no screen use. PC and TV states described in the protocol were considered as screen use class, thus making detection of screen use independent of the displayed content. After training the LSVM on data from all but one sensor, data from the remaining sensor were used for prediction. Figure <xref ref-type="fig" rid="F5">5</xref> depicts the data-processing pipeline from raw sensor data to classifier output.</p>
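The leave-one-sensor-out scheme described above can be sketched with scikit-learn's grouped cross-validation. The data below are random placeholders standing in for the window features; `X`, `y`, and `sensor` are hypothetical names, and the scaler/classifier settings are illustrative rather than the paper's exact configuration.

```python
import numpy as np
from sklearn.model_selection import LeaveOneGroupOut, cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import LinearSVC

# Placeholder feature matrix (windows x features), binary labels
# (screen use vs. no screen use), and ids of the four light sensors.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 10))
y = rng.integers(0, 2, size=200)
sensor = np.repeat([0, 1, 2, 3], 50)

# Train on three sensors, predict on the held-out one, rotating
# through all four sensors; one accuracy score per held-out sensor.
clf = make_pipeline(StandardScaler(), LinearSVC(dual=False))
scores = cross_val_score(clf, X, y, groups=sensor, cv=LeaveOneGroupOut())
```

Holding out a whole sensor per fold tests robustness to intersensor variation, which is exactly why four sensors were recorded.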
</sec>
</sec>
<sec id="S4-3">
<label>4.3</label> <title>Lab Study</title>
<p>We evaluated screen use detection using WISEglass in the lab study. For all participants, we used the same viewing distances and devices, but did not control ambient light intensity. Fourteen participants (2 females, 12 males, between 20 and 39&#x02009;years of age) were asked to read a print magazine, watch a documentary, and use a PC for 20&#x02009;min per activity.</p>
<sec id="S4-3-4">
<label>4.3.1</label> <title>Sensing Approach</title>
<p>Color light sensor data were sampled and stored in flash memory with a sampling rate of 6.5&#x02009;Hz and a gain factor of 1. Data were downloaded after each participant completed the protocol.</p>
</sec>
<sec id="S4-3-5">
<label>4.3.2</label> <title>Recording Protocol</title>
<p>After arrival, the protocol was explained to participants in detail.</p>
<p>Participants would perform the following activities for 20&#x02009;min each: (1) reading a print article, (2) watching a documentary about coffee on a 27&#x02033; Samsung SyncMaster P2770HD TV at 140&#x02009;cm distance, and (3) browsing the Internet using a 24&#x02033; HP EliteDisplay E241i screen at 70&#x02009;cm distance. The typical ergonomic distance is between 50 and 70&#x02009;cm, where 50&#x02009;cm was used in the test bench analysis. Figure <xref ref-type="fig" rid="F6">6</xref> depicts the activities as performed by each participant.</p>
<fig position="float" id="F6">
<label>Figure 6</label>
<caption><p><bold>Lab study</bold>. Activities reading a print magazine, watching TV, and PC use were performed by participants for 20&#x02009;min per activity.</p></caption>
<graphic xlink:href="fict-04-00008-g006.tif"/>
</fig>
<p>Both screens were reset to factory defaults prior to recordings. Participants wore WISEglass during data recordings. Ambient light intensity was measured with a standard lux meter (AMPROBE LM-120) at the beginning of each recording. A total of 15.2&#x02009;h of data were recorded during the lab study.</p>
</sec>
<sec id="S4-3-6">
<label>4.3.3</label> <title>Evaluation Procedure</title>
<p>We applied the same evaluation procedure as described in Section <xref ref-type="sec" rid="S4-2-3">4.2.3</xref> with the following changes. To reduce the number of features, mRMR (Peng et al., <xref ref-type="bibr" rid="B20">2005</xref>) feature selection was applied. Different numbers of features were evaluated as described in Section <xref ref-type="sec" rid="S5">5</xref>. We used the same LSVM classifier as in the test bench but applied Leave-one-participant-Out (LOPO) cross-validation jointly for feature selection and classification.</p>
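A sketch of the leave-one-participant-out scheme with per-fold feature selection follows. The paper uses mRMR; a univariate `SelectKBest` filter stands in here so the sketch runs with scikit-learn alone, and the data and variable names are placeholders. The essential point the sketch shows is that selection happens inside each fold, so the held-out participant never influences which features are chosen.

```python
import numpy as np
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.model_selection import LeaveOneGroupOut
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import LinearSVC

# Placeholder data: windows x features, labels, and participant ids.
rng = np.random.default_rng(1)
X = rng.normal(size=(140, 50))
y = rng.integers(0, 2, size=140)
participant = np.repeat(np.arange(14), 10)

# Feature selection (stand-in for mRMR) and LSVM fitted jointly
# per fold; each of the 14 participants is held out once.
pipe = make_pipeline(StandardScaler(),
                     SelectKBest(f_classif, k=20),
                     LinearSVC(dual=False))
accs = []
for train, test in LeaveOneGroupOut().split(X, y, groups=participant):
    pipe.fit(X[train], y[train])
    accs.append(pipe.score(X[test], y[test]))
```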
</sec>
</sec>
<sec id="S4-4">
<label>4.4</label> <title>ADL Study</title>
<p>To evaluate if screen use can be distinguished from other typical ADLs, we used data from a previous study. In total, nine participants (3 females, 6 males, between 20 and 27&#x02009;years of age) were involved in the study. The ADL data set was recorded to evaluate activity recognition using the WISEglass inertial measurement unit (Wahl et al., <xref ref-type="bibr" rid="B29">2015c</xref>).</p>
<sec id="S4-4-7">
<label>4.4.1</label> <title>Sensing Approach</title>
<p>Participants wore WISEglass for the duration of the recordings. Data from the light sensor were sampled and stored in flash memory at 6.5&#x02009;Hz.</p>
</sec>
<sec id="S4-4-8">
<label>4.4.2</label> <title>Recording Protocol</title>
<p>The study protocol was designed to include many typical ADLs across a regular day. Participants executed a scripted protocol while being followed by an observer labeling the data during recording time using the ACTLog application for Android (Spina et al., <xref ref-type="bibr" rid="B24">2013</xref>). In total, 66.08&#x02009;h of data were recorded. After arrival, the study protocol was explained in detail. Participants were compensated for their efforts with a 25 Euro Amazon voucher and three meals during the recordings. Data from two participants were excluded due to hardware issues during recordings leaving seven participants for the ADL analyses. Table <xref ref-type="table" rid="T2">2</xref> provides a listing of the performed ADLs, their total recording duration, and the activity cluster they belong to.</p>
<table-wrap position="float" id="T2">
<label>Table 2</label>
<caption><p><bold>ADL study</bold>.</p></caption>
<table frame="hsides" rules="groups">
<thead>
<tr>
<th align="left">No.</th>
<th align="left">Cluster</th>
<th align="left">Activity</th>
<th align="center">Duration [min:s]</th>
</tr>
</thead>
<tbody>
<tr>
<td align="left">1</td>
<td align="left">Eat</td>
<td align="left">Breakfast</td>
<td align="center">84:43</td>
</tr>
<tr>
<td align="left"/>
<td align="center"/>
<td align="left">Lunch</td>
<td align="center">156:49</td>
</tr>
<tr>
<td align="left"/>
<td align="center"/>
<td align="left">Dinner</td>
<td align="center">181:53</td>
</tr>
<tr>
<td align="left" colspan="4"><hr/></td>
</tr>
<tr>
<td align="left">2</td>
<td align="left">Walk</td>
<td align="left">Lab to bathroom</td>
<td align="center">17:37</td>
</tr>
<tr>
<td align="left"/>
<td align="center"/>
<td align="left">Lab to gym</td>
<td align="center">36:27</td>
</tr>
<tr>
<td align="left"/>
<td align="center"/>
<td align="left">On treadmill 2&#x02009;km/h</td>
<td align="center">47:06</td>
</tr>
<tr>
<td align="left"/>
<td align="center"/>
<td align="left">Gym to lab</td>
<td align="center">35:56</td>
</tr>
<tr>
<td align="left"/>
<td align="center"/>
<td align="left">Lab to cafeteria</td>
<td align="center">20:11</td>
</tr>
<tr>
<td align="left"/>
<td align="center"/>
<td align="left">Queuing for lunch</td>
<td align="center">6:28</td>
</tr>
<tr>
<td align="left"/>
<td align="center"/>
<td align="left">Picking up lunch</td>
<td align="center">13:20</td>
</tr>
<tr>
<td align="left"/>
<td align="center"/>
<td align="left">Cafeteria to lab</td>
<td align="center">25:28</td>
</tr>
<tr>
<td align="left"/>
<td align="center"/>
<td align="left">Lab to restaurant</td>
<td align="center">58:36</td>
</tr>
<tr>
<td align="left"/>
<td align="center"/>
<td align="left">Restaurant to lab</td>
<td align="center">55:26</td>
</tr>
<tr>
<td align="left" colspan="4"><hr/></td>
</tr>
<tr>
<td align="left">3</td>
<td align="left">Brush</td>
<td align="left">Teeth</td>
<td align="center">39:17</td>
</tr>
<tr>
<td align="left" colspan="4"><hr/></td>
</tr>
<tr>
<td align="left">4</td>
<td align="left">Stairs</td>
<td align="left">Walking</td>
<td align="center">19:49</td>
</tr>
<tr>
<td align="left" colspan="4"><hr/></td>
</tr>
<tr>
<td align="left">5</td>
<td align="left">Jog</td>
<td align="left">On treadmill 5&#x02009;km/h</td>
<td align="center">45:27</td>
</tr>
<tr>
<td align="left" colspan="4"><hr/></td>
</tr>
<tr>
<td align="left">6</td>
<td align="left">Cycle</td>
<td align="left">On gym trainer</td>
<td align="center">91:14</td>
</tr>
<tr>
<td align="left" colspan="4"><hr/></td>
</tr>
<tr>
<td align="left">7</td>
<td align="left">Read</td>
<td align="left">A book</td>
<td align="center">276:31</td>
</tr>
<tr>
<td align="left" colspan="4"><hr/></td>
</tr>
<tr>
<td align="left"/>
<td align="center"/>
<td align="left">Desk work</td>
<td align="center">273:34</td>
</tr>
<tr>
<td align="left" colspan="4"><hr/></td>
</tr>
<tr>
<td align="left">8</td>
<td align="left">Screen</td>
<td align="left">Computer work</td>
<td align="center">253:50</td>
</tr>
<tr>
<td align="left"/>
<td align="center"/>
<td align="left">Watching movie</td>
<td align="center">187:56</td>
</tr>
<tr>
<td align="left" colspan="4"><hr/></td>
</tr>
<tr>
<td align="left">9</td>
<td align="left">Cleaning</td>
<td align="left">Vacuuming</td>
<td align="center">45:22</td>
</tr>
<tr>
<td align="left"/>
<td align="center"/>
<td align="left">Wiping tables</td>
<td align="center">45:02</td>
</tr>
</tbody>
</table>
<table-wrap-foot>
<p><italic>Activities and their recorded total duration for nine participants</italic>.</p>
</table-wrap-foot>
</table-wrap>
<p>To evaluate the detection of screen use, we used the computer work activity as the screen use class. We omitted data of the unlabeled class and from the watching a movie activity, where participants looked at a projector screen. Projectors differ from regular screens, and their detection is not the goal of this work.</p>
</sec>
<sec id="S4-4-9">
<label>4.4.3</label> <title>Evaluation Procedure</title>
<p>For the ADL study, we modified the evaluation procedure described in Section <xref ref-type="sec" rid="S4-2-3">4.2.3</xref> and Section <xref ref-type="sec" rid="S4-3-6">4.3.3</xref> as follows.</p>
<p>All results for the ADL data set were computed using the best 200 features selected by mRMR feature selection. Feature selection was performed on training data only for each validation fold.</p>
<p>The percentage of screen use in the ADL data set was approximately 11% of the entire data set. We applied downsampling to adapt the ratio of screen use per participant from 10 to 90% in steps of 10%. Downsampling was performed by randomly removing windows of 5&#x02009;s of the desired class. To ensure stability of our results, we repeated the selection process 10 times per screen use ratio and averaged the results.</p>
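The downsampling step can be sketched as follows. The sketch covers the direction in which the screen use ratio is raised above the native ~11% by removing non-screen windows; the function name and the index-based interface are illustrative.

```python
import numpy as np

def downsample_to_ratio(labels, target_ratio, rng):
    """Randomly drop non-screen windows (label 0) until the fraction of
    screen-use windows (label 1) reaches target_ratio. Returns the
    indices of the kept 5-s windows."""
    labels = np.asarray(labels)
    pos = np.flatnonzero(labels == 1)
    neg = np.flatnonzero(labels == 0)
    # Keep all screen-use windows; keep only as many non-screen
    # windows as the target ratio allows.
    n_neg = int(round(len(pos) * (1 - target_ratio) / target_ratio))
    keep_neg = rng.choice(neg, size=min(n_neg, len(neg)), replace=False)
    return np.sort(np.concatenate([pos, keep_neg]))

# Demo: raise a native 10% screen use share to 50%.
rng = np.random.default_rng(0)
labels = np.array([1] * 10 + [0] * 90)
kept = downsample_to_ratio(labels, 0.5, rng)
```

Repeating the random selection 10 times and averaging, as in the text, smooths out the variance introduced by which windows happen to be dropped.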
</sec>
</sec>
</sec>
<sec id="S5">
<label>5</label> <title>Results</title>
<p>In this section, we first analyze the impact of screen use depending on content and screen type. Subsequently, we present screen use evaluation results for test bench recordings, lab study, and ADL study.</p>
<sec id="S5-5">
<label>5.1</label> <title>Impact of Screen Content and Devices</title>
<p>For typical PC content, backgrounds are often white, resulting in a majority of pixels emitting light. For TV content, backgrounds are rather dark, resulting in a majority of pixels emitting little to no light. We measured irradiance using the reference spectrometer as described for the test bench investigation (Section <xref ref-type="sec" rid="S4-2">4.2</xref>) during PC use and watching TV activities and found average irradiance during PC use to be 3 times higher than for watching TV with 0.03&#x02009;mW/cm<sup>2</sup> for PC use and 0.01&#x02009;mW/cm<sup>2</sup> for watching TV. Thus, when using the same screen for watching TV at 200&#x02009;cm distance, light exposure is 48 times lower compared to PC use at 50&#x02009;cm distance.</p>
<p>Table <xref ref-type="table" rid="T3">3</xref> shows light intensity at the user&#x02019;s eye for typical devices and content types. According to our results, PC use with a desktop setup has the highest light intensity at the user&#x02019;s eyes and was thus investigated further. Light intensity is a primary consideration in this investigation, as intensities above 50&#x02009;lx have substantial impact on the circadian phase (Wood et al., <xref ref-type="bibr" rid="B31">2013</xref>). Overall, watching TV is less critical than PC use due to viewing distance and content type. Screen size was also important: although bright, smartphone screens are too small to critically influence the circadian phase, as the resulting light intensities at the eye are typically below 50&#x02009;lx.</p>
<table-wrap position="float" id="T3">
<label>Table 3</label>
<caption><p><bold>Light intensity at the user&#x02019;s eyes during PC use and watching TV for typical devices</bold>.</p></caption>
<table frame="hsides" rules="groups">
<thead>
<tr>
<th align="left">Device</th>
<th align="center">Aspect ratio</th>
<th align="center">Diagonal [cm]</th>
<th align="center">Area [m<sup>2</sup>]</th>
<th align="center">Luminance [cd/m<sup>2</sup>]</th>
<th align="center">Luminous intensity [cd]</th>
<th align="center">Distance [m]</th>
<th align="center">Light intensity PC [lx]</th>
<th align="center">Light intensity TV [lx]</th>
</tr>
<tr>
<th align="center"/>
<th align="center"><italic>w</italic>:<italic>h</italic></th>
<th align="center"><italic>l</italic></th>
<th align="center"><italic>A</italic></th>
<th align="center"><italic>L</italic></th>
<th align="center"><italic>I</italic></th>
<th align="center"><italic>d</italic></th>
<th align="center"><italic>E<sub>PC</sub></italic></th>
<th align="center"><italic>E<sub>TV</sub></italic></th>
</tr>
</thead>
<tbody>
<tr>
<td align="left">Galaxy S7</td>
<td align="center">16:9</td>
<td align="center">12.95</td>
<td align="center">0.0072</td>
<td align="center">350</td>
<td align="center">2.51</td>
<td align="center">0.32</td>
<td align="center">24.51</td>
<td align="center">8.17</td>
</tr>
<tr>
<td align="left">iPhone 6s</td>
<td align="center">16:9</td>
<td align="center">11.94</td>
<td align="center">0.0061</td>
<td align="center">50</td>
<td align="center">3.35</td>
<td align="center">0.32</td>
<td align="center">32.71</td>
<td align="center">10.90</td>
</tr>
<tr>
<td align="left">iPad air 2</td>
<td align="center">4:3</td>
<td align="center">24.64</td>
<td align="center">0.0291</td>
<td align="center">420</td>
<td align="center">12.24</td>
<td align="center">0.5</td>
<td align="center">48.95</td>
<td align="center">16.32</td>
</tr>
<tr>
<td align="left">Dell XPS 13</td>
<td align="center">16:9</td>
<td align="center">33.78</td>
<td align="center">0.0488</td>
<td align="center">276</td>
<td align="center">13.46</td>
<td align="center">0.5</td>
<td align="center">53.84</td>
<td align="center">17.95</td>
</tr>
<tr>
<td align="left">MacBook Pro 15</td>
<td align="center">16:10</td>
<td align="center">39.12</td>
<td align="center">0.0688</td>
<td align="center">315</td>
<td align="center">21.66</td>
<td align="center">0.5</td>
<td align="center">86.65</td>
<td align="center">28.88</td>
</tr>
<tr>
<td align="left">HP monitor</td>
<td align="center">16:10</td>
<td align="center">60.96</td>
<td align="center">0.1670</td>
<td align="center">250</td>
<td align="center">41.75</td>
<td align="center">0.6</td>
<td align="center">115.98</td>
<td align="center">38.66</td>
</tr>
<tr>
<td align="left">Samsung TV</td>
<td align="center">16:9</td>
<td align="center">101.60</td>
<td align="center">0.4411</td>
<td align="center">120</td>
<td align="center">52.93</td>
<td align="center">2.0</td>
<td align="center">13.23</td>
<td align="center">4.41</td>
</tr>
</tbody>
</table>
<table-wrap-foot>
<p><italic>We used screen size, luminance information, and typical viewing distances to calculate light intensity at the user&#x02019;s eyes as described in Section <xref ref-type="sec" rid="S4-1">4.1</xref></italic>.</p>
</table-wrap-foot>
</table-wrap>
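The values in Table 3 follow from a short calculation: screen area from diagonal and aspect ratio, luminous intensity as luminance times area, and illuminance at the eye by the inverse-square law, with TV content taken as one third of PC content following the irradiance ratio measured above. A minimal sketch, reproducing the MacBook Pro 15 row:

```python
def screen_light_intensity(diag_m, aspect_w, aspect_h, luminance, dist_m):
    """Screen area A from diagonal and aspect ratio, luminous intensity
    I = L * A (cd), and illuminance at the eye E = I / d^2 (lx).
    TV content is assumed to emit one third of PC content."""
    area = diag_m ** 2 * (aspect_w * aspect_h) / (aspect_w ** 2 + aspect_h ** 2)
    lum_int = luminance * area        # cd
    e_pc = lum_int / dist_m ** 2      # lx, PC content
    return area, lum_int, e_pc, e_pc / 3.0

# MacBook Pro 15 row: 39.12 cm diagonal, 16:10, 315 cd/m^2, 0.5 m.
area, lum_int, e_pc, e_tv = screen_light_intensity(0.3912, 16, 10, 315, 0.5)
```

The result matches the tabulated 0.0688&#x02009;m<sup>2</sup>, 86.65&#x02009;lx, and 28.88&#x02009;lx up to rounding.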
</sec>
<sec id="S5-6">
<label>5.2</label> <title>Test Bench Recordings</title>
<p>Using the extracted features and LSVM classification, screen use could be detected independent of lamp type and ambient light intensity. Averaging over all lamp types, a receiver operating characteristic area under curve (ROC AUC) of &#x02248;1 was achieved, indicating that screen use could be detected independent of the ambient light type. Table <xref ref-type="table" rid="T4">4</xref> summarizes the recognition results for the test bench recordings. For screen use, &#x0007E;199&#x02009;h (66.6%) and, for non-screen use, &#x0007E;99.5&#x02009;h (33.3%) of data were used.</p>
<table-wrap position="float" id="T4">
<label>Table 4</label>
<caption><p><bold>Test bench</bold>.</p></caption>
<table frame="hsides" rules="groups">
<thead>
<tr>
<th align="left">Metric</th>
<th align="center">Halogen</th>
<th align="center">LED</th>
<th align="center">Warm CCFL</th>
<th align="center">Cold CCFL</th>
<th align="center">Dark</th>
</tr>
</thead>
<tbody>
<tr>
<td align="left">ROC AUC</td>
<td align="center">0.99</td>
<td align="center">1.0</td>
<td align="center">1.0</td>
<td align="center">1.0</td>
<td align="center">1.0</td>
</tr>
</tbody>
</table>
<table-wrap-foot>
<p><italic>Results for recordings using different ambient light sources. Results show that screen use can be detected independent of ambient light type</italic>.</p>
</table-wrap-foot>
</table-wrap>
</sec>
<sec id="S5-7">
<label>5.3</label> <title>Lab Study</title>
<p>We investigated the impact of window size on screen use detection performance for the lab study data. For all window sizes, the ROC AUC was between 0.85 and 0.90. Windows of 5&#x02009;s produced the largest ROC AUC at the shortest recognition delay. For an online implementation of screen use detection, recognition delay is crucial. Thus, we chose 5-s windows for further analysis.</p>
<p>To analyze the impact of feature reduction on detection performance, we applied mRMR feature selection. Figure <xref ref-type="fig" rid="F7">7</xref> shows the number of features used vs. ROC AUC performance. Using &#x02248;70 or more features increased performance. For 200 features, results were stable with an average ROC AUC above 0.9.</p>
<fig position="float" id="F7">
<label>Figure 7</label>
<caption><p><bold>Lab study</bold>. ROC AUC vs. feature number is plotted where gray diamonds represent outliers. Performance drops when using less than &#x02248;70 features.</p></caption>
<graphic xlink:href="fict-04-00008-g007.tif"/>
</fig>
<p>Figure <xref ref-type="fig" rid="F8">8</xref> shows ROC AUC over ambient light intensity. For 10 of 14 participants, an ROC AUC above 0.9 was achieved. In all 7 samples with an ambient light intensity below 200&#x02009;lx, the ROC AUC was above 0.9. In 3 of 14 participants, ambient light intensity was above 500&#x02009;lx, the recommended light intensity for working environments (Deutsche Gesetzliche Unfallversicherung e.V. (DGUV), <xref ref-type="bibr" rid="B10">2015</xref>), and the ROC AUC dropped below 0.8.</p>
<fig position="float" id="F8">
<label>Figure 8</label>
<caption><p><bold>Lab study</bold>. Recognition performance over ambient light intensity. Each participant is represented by a dot. The dashed trend line shows the relationship between ambient light intensity and ROC AUC. The 95% confidence interval around the trend line is shaded gray. It can be seen that screen use detection performance decreased when ambient light intensity increased.</p></caption>
<graphic xlink:href="fict-04-00008-g008.tif"/>
</fig>
</sec>
<sec id="S5-8">
<label>5.4</label> <title>ADL Study</title>
<p>We evaluated screen use detection within a large variety of ADLs. Figure <xref ref-type="fig" rid="F9">9</xref>A depicts positive predictive value (PPV), true positive rate (TPR), and false positive rate (FPR) for different screen use percentages. PPV increased with screen use percentage from 0.45 to 0.98, while TPR and FPR did not show relevant changes.</p>
<fig position="float" id="F9">
<label>Figure 9</label>
<caption><p><bold>ADL study</bold>. <bold>(A)</bold> PPV, TPR, and FPR for 10&#x02013;90% screen use. Gray diamonds represent outliers. <bold>(B)</bold> Per participant ROC AUC score at 30% screen use.</p></caption>
<graphic xlink:href="fict-04-00008-g009.tif"/>
</fig>
<p>Figure <xref ref-type="fig" rid="F9">9</xref>B shows ROC AUC per participant for 30% screen use. For all participants, the ROC AUC was between 0.69 and 0.91, with an average of 0.83. ROC AUC performance was stable per participant.</p>
</sec>
</sec>
<sec id="S6" sec-type="discussion">
<label>6</label> <title>Discussion</title>
<p>Today, computers are on almost every desk, and the amount of work being digitized is still growing. Further growth of screen use during the day and evening may lead to circadian misalignment and an increase in the number of people suffering from eyestrain. We thus consider a screen use share above 30% realistic and potentially growing in the future. Our screen use detection results during ADL showed relevant performance with an average ROC AUC of 0.83 at 30% screen use.</p>
<p>Previous study protocols administered continuous screen use for 1&#x02013;5&#x02009;h (Cajochen et al., <xref ref-type="bibr" rid="B7">2011</xref>; Heath et al., <xref ref-type="bibr" rid="B14">2014</xref>; Chang et al., <xref ref-type="bibr" rid="B9">2015</xref>; van der Lely et al., <xref ref-type="bibr" rid="B26">2015</xref>) when investigating circadian phase shifts due to screen use. Wood et al. (<xref ref-type="bibr" rid="B31">2013</xref>) found measurable melatonin suppression after 2&#x02009;h of tablet use. Our proposed detection only required 5&#x02009;s of data, thus detecting screen use well before a measurable suppression of melatonin. Depending on real-time detection requirements, majority voting could be applied over multiple windows to smooth the classifier result.</p>
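<p>The majority voting smoothing mentioned above can be sketched as follows; this is a minimal illustration, and the example window labels and vote span of five 5-s windows are assumptions rather than parameters from our system:</p>

```python
from collections import Counter

def majority_vote(window_labels, vote_span=5):
    """Smooth per-window classifier decisions (1 = screen use,
    0 = no screen use) by majority voting over the last
    `vote_span` consecutive windows."""
    smoothed = []
    for i in range(len(window_labels)):
        # Vote over the decisions available up to and including window i.
        votes = window_labels[max(0, i - vote_span + 1):i + 1]
        smoothed.append(Counter(votes).most_common(1)[0][0])
    return smoothed

# A single spurious detection (window 3) is suppressed, while a
# sustained run of detections eventually wins the vote.
print(majority_vote([0, 0, 1, 0, 0, 0, 1, 1, 1, 1]))
```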
<p>Circadian phase shift can be induced by light sources other than screens. For example, looking into a high-energy blue light source could also shift the circadian phase but may not be detected by our screen use detector. Smart eyeglasses measure light intensity per spectral component and can thus capture different sources of phase shift. In contrast to other light sources, the risk of phase shift due to screen use can be minimized by adapting the screen color profile; therefore, detecting screen use is of particular interest.</p>
<sec id="S6-9">
<label>6.1</label> <title>Performance Metrics</title>
<p>For screen use detection, a low false alarm rate is crucial, as false alarms tend to frustrate users. While ROC AUC is a good performance measure for balanced data sets, we additionally derived the precision of screen use detection (PPV), the ratio of correctly detected screen use (TPR), and the ratio of falsely reported screen use (FPR). ROC AUC is computed from TPR and FPR only and is thus insensitive to class skew (Fawcett, <xref ref-type="bibr" rid="B13">2004</xref>). In unbalanced data, where screen use instances are rare (e.g., at 10% screen use), FPR can be low and TPR high even though PPV is low.</p>
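<p>The class-skew effect described above can be made concrete with a small sketch; the confusion counts below are hypothetical and chosen only to illustrate the point. Holding TPR at 0.9 and FPR at 0.1, shifting the class balance from 50% to 10% screen use drops PPV from 0.9 to 0.5:</p>

```python
def detection_metrics(tp, fp, tn, fn):
    """Derive PPV, TPR, and FPR from confusion-matrix counts."""
    ppv = tp / (tp + fp)   # precision of reported screen use
    tpr = tp / (tp + fn)   # ratio of correctly detected screen use
    fpr = fp / (fp + tn)   # ratio of falsely reported screen use
    return ppv, tpr, fpr

# Balanced data (50% screen use): PPV = 0.9, TPR = 0.9, FPR = 0.1
balanced = detection_metrics(tp=90, fp=10, tn=90, fn=10)

# Unbalanced data (10% screen use): identical TPR and FPR,
# but PPV collapses to 0.5 because false positives now outnumber
# true positives relative to the rare positive class.
skewed = detection_metrics(tp=90, fp=90, tn=810, fn=10)
```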
</sec>
<sec id="S6-10">
<label>6.2</label> <title>Ambient Light</title>
<p>Our lab study showed that screen use detection works best at ambient light intensities below 200&#x02009;lx. Interpreting light emitted by the screen as signal and ambient light as noise, an increase in ambient light adds noise and thus decreases the signal-to-noise ratio. However, increased ambient light also reduces the need for screen use detection: circadian phase shift induced by screen use typically occurs at low ambient light intensity only. Circadian phase response curves express the impact of light exposure on the circadian phase depending on its timing.</p>
<p>Light intensities span multiple orders of magnitude, ranging from 0&#x02013;1,000&#x02009;lx indoors to more than 100,000&#x02009;lx outdoors on a sunny day. Our screen detection approach is intended for indoor screen use; outdoors, sunlight already influences the circadian phase. ADL recordings included outdoor walking episodes, and our system did not falsely report screen use at these high brightness values.</p>
<p>Absolute light intensities differ depending on the environment, lamp type, viewing distance, content, time of day, and weather. Virtual light channels based on ratios of the different light color components made the feature space independent of absolute light intensities. Further calibration of the color light channels for their different energy-related responses was not needed, since the subsequent pattern analysis weights the features with respect to the classification task.</p>
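<p>The intensity invariance of such ratio features can be illustrated with a short sketch; the specific channel combinations below are hypothetical examples, not our actual feature set:</p>

```python
def virtual_channels(r, g, b, c):
    """Map raw RGBC color sensor counts to ratio features that do
    not depend on absolute light intensity (illustrative
    combinations only)."""
    return {
        "r_norm": r / c,     # red share of the clear channel
        "g_norm": g / c,     # green share
        "b_norm": b / c,     # blue share
        "b_over_r": b / r,   # blue-to-red balance
    }

# Scaling all channels by the same factor (same light source, but a
# brighter scene) leaves every ratio feature unchanged.
dim = virtual_channels(120, 200, 160, 480)
bright = virtual_channels(1200, 2000, 1600, 4800)
```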
</sec>
<sec id="S6-11">
<label>6.3</label> <title>Screen Use Activities</title>
<p>In the test bench study, we combined PC and TV content into one screen use class to ensure that screen use could be detected independent of displayed content. For both content types, distance between screen and sensor was 50&#x02009;cm. The combination of video content and close viewing distance could happen in deployment, e.g., when watching video content online using a notebook. However, our calculations (Table <xref ref-type="table" rid="T3">3</xref>) showed that TV use, even at a short viewing distance, has no substantial impact on the circadian phase.</p>
<p>Section <xref ref-type="sec" rid="S5-5">5.1</xref> reported on light intensity at the user&#x02019;s eyes for different device and content types. Previous research investigated the influence of screen use on circadian phase (Cajochen et al., <xref ref-type="bibr" rid="B7">2011</xref>) and found that screen use can induce circadian phase shifts. Wood et al. (<xref ref-type="bibr" rid="B31">2013</xref>) suggested 50&#x02009;lx as the critical threshold for the circadian system. Our calculations showed that TVs, while typically being the largest screens used, produce less than 50&#x02009;lx at the user&#x02019;s eyes due to the large viewing distance. Typical TV content caused one-third of the irradiance compared to PC use. Heath et al. (<xref ref-type="bibr" rid="B14">2014</xref>) found that 1&#x02009;h of tablet use prior to bedtime did not significantly impact circadian phase. While used at a close distance, smartphones and tablets are not large enough to reach the 50-lx threshold. PC use was the only combination of distance, content type, and screen size that could produce a light intensity of over 50&#x02009;lx at the user&#x02019;s eye. In contrast, Chang et al. (<xref ref-type="bibr" rid="B9">2015</xref>) found significant melatonin suppression after 5&#x02009;days of using a backlit eReader for 4&#x02009;h prior to bedtime at only 30&#x02009;lx. However, their control condition was reading on paper in ambient light of less than 1&#x02009;lx, comparable to deep twilight; such low light intensities might make reading difficult. We chose to investigate PC use with a desktop setup because it produces the strongest light intensity at the user&#x02019;s eyes.</p>
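<p>Why viewing distance dominates the illuminance at the eyes can be seen from a rough point-source sketch. The luminous flux value below is hypothetical, and real screens are extended sources, so this only illustrates the inverse-square trend rather than reproducing the calculation behind Table 3:</p>

```python
import math

def illuminance_lx(luminous_flux_lm, distance_m):
    """Point-source approximation: illuminance falls with the
    square of the viewing distance (inverse-square law)."""
    return luminous_flux_lm / (4 * math.pi * distance_m ** 2)

# A hypothetical ~150 lm source: close to the 50 lx threshold at a
# PC viewing distance (0.5 m), but only a few lx at a typical TV
# viewing distance (1.5 m and beyond).
at_pc = illuminance_lx(150, 0.5)   # ~48 lx
at_tv = illuminance_lx(150, 1.5)   # ~5 lx
```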
</sec>
<sec id="S6-12">
<label>6.4</label> <title>Assumptions</title>
<p>We used PC work as the prototype activity for high-irradiance screen use, as we assumed it yields the highest light intensities at the user&#x02019;s eyes. Typically during PC work, backgrounds are white, e.g., when editing a document or browsing the web, resulting in a majority of bright pixels.</p>
<p>Test bench results showed that screen use was detected well when sensors were mounted in a fixed position, independent of the amount or type of artificial light or of the content displayed on the screen. In reality, people move their head during screen use and sometimes look away from the screen for brief moments. Such movements are hard to annotate during an observational study and thus introduced noise into our screen detection. Head motion was clearly visible in the raw light sensor signal of the ADL study. Additional sensor information, e.g., from a motion sensor, could be used to detect head motion and thus interruptions of screen use.</p>
</sec>
<sec id="S6-13">
<label>6.5</label> <title>Practical Applications</title>
<p>Possible intervention measures have been investigated (Heath et al., <xref ref-type="bibr" rid="B14">2014</xref>; van der Lely et al., <xref ref-type="bibr" rid="B26">2015</xref>). The latest update of Apple iOS introduced a software feature to adapt screen color profiles to minimize unwanted circadian phase shifts. In this work, we showed that detecting screen use is feasible with the light sensor of smart eyeglasses. The screen use detection could be used to control the screen color profile.</p>
<p>Ambient monitoring methods could detect presence in front of a desktop screen, e.g., camera-based face detection or ultrasound-based proximity detection (Jaramillo-Garcia et al., <xref ref-type="bibr" rid="B16">2014</xref>). However, screen-based light intensity must be assigned to an individual user to implement alerts, which is where wearable systems are advantageous. In addition, wearable systems can address privacy concerns more easily. An ambient monitoring solution requires one setup per screen, and presence information alone is not sufficient to identify relevant screen illuminance, as users may already use a software intervention tool, e.g., f.lux, to adapt the screen&#x02019;s spectral composition. Moreover, a presence detector cannot distinguish between being next to a screen and actually looking at it. With the light sensor embedded in the smart eyeglasses bridge, the wearable system captures the wearer&#x02019;s field of view.</p>
<p>We intentionally chose eyeglasses as a sensing platform because regular eyeglasses are worn by many people. Smart eyeglasses, like regular eyeglasses, are worn throughout the day. In contrast to head-worn light measurement devices such as the Daysimeter (Bierman et al., <xref ref-type="bibr" rid="B5">2005</xref>), smart eyeglasses may serve multiple applications (Amft et al., <xref ref-type="bibr" rid="B1">2015</xref>). In contrast to other smart eyewear focused on interaction and displaying information, e.g., Google Glass, our work focuses on sensing. While the camera of Google Glass could substitute for a color light sensor, its larger power consumption may limit continuous use.</p>
<p>Detecting screen use is challenging, as screen-emitted light is distributed over the visible spectrum, with frequency components close to those of other light sources. Screen use was found to significantly suppress the evening rise in endogenous melatonin and thus to misalign circadian rhythms (Cajochen et al., <xref ref-type="bibr" rid="B7">2011</xref>). Screen use can also cause repetitive strain injuries (RSI), such as eye strain (The Vision Council of America, <xref ref-type="bibr" rid="B25">2016</xref>). With our screen use detection, it is possible to support users by suggesting intervention measures when needed, including adapting a screen&#x02019;s color profile to prevent circadian phase shifts and taking regular breaks to reduce RSI risks.</p>
<p>Screen use detection can be used beyond detecting impact on the circadian phase. Regular breaks are important to prevent eye strain, the most common repetitive strain injury (The Vision Council of America, <xref ref-type="bibr" rid="B25">2016</xref>). Screen use detection can suggest breaks during computer work and remind users of the 20-20-20 rule: every 20&#x02009;min of screen work, take a 20-s break and look at something at least 20 feet away.</p>
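<p>As a minimal sketch, a 20-20-20 reminder schedule could be derived directly from accumulated screen use time; the function name and interval handling below are assumptions for illustration:</p>

```python
def break_schedule(screen_use_min, interval_min=20):
    """Minutes from session start at which 20-20-20 breaks are due,
    given the total detected screen use time in minutes."""
    return list(range(interval_min, screen_use_min + 1, interval_min))

# 65 min of detected screen work: breaks due at 20, 40, and 60 min,
# each a 20-s look at something at least 20 feet away.
print(break_schedule(65))
```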
</sec>
</sec>
<sec id="S7">
<label>7</label> <title>Conclusion</title>
<p>We introduced an approach for screen use detection based on a color light sensor embedded in smart eyeglasses and evaluated it in three studies. Our evaluation showed perfect results in the test bench analysis. In the lab and ADL studies, head motion and ambient light variation introduced noise.</p>
<p>Lab study results revealed that screen use detection performance is related to ambient light intensity. Screen use was detected with over 0.9 ROC AUC at an ambient light intensity below 200&#x02009;lx. A data set of typical ADLs was used to further evaluate screen use detection. Screen use was detected with an average ROC AUC of 0.83 for 30% screen use. Detection performance was evaluated on person-independent models for lab and ADL data sets.</p>
<p>Our work could be applied to other wearables as few hardware components are required. A color light sensor, battery, and wireless interface could be embedded into smart jewelry, e.g., a brooch or necklace. However, it is essential that the light sensor&#x02019;s field of view is aligned with the wearer&#x02019;s eyes. Our screen use detection algorithm is independent of the sensor position.</p>
<p>The proposed system detected screen use quickly due to the short window size of 5&#x02009;s. Screen use information could be used to prevent eye strain by reminding users to take regular breaks. Undesired impact of screen use, e.g., circadian phase shift, could be minimized by either notifying the user or (automatically) activating a software intervention measure, e.g., f.lux. We thus conclude that smart eyeglasses are a feasible platform for screen use detection.</p>
</sec>
<sec id="S8">
<title>Ethics Statement</title>
<p>Our lab study and ADL study were exempt from ethics approval as they were observational and non-invasive, and the privacy of participants was maintained. All recordings were stored anonymously, i.e., participants&#x02019; names were replaced by a participant ID. In the lab study, the total recording period was 1&#x02009;h, during which participants read, watched TV, and browsed the Internet. None of these activities posed a risk to the participants. ADL study recordings lasted approximately 10&#x02009;h per participant. At all times, participants were supervised by a study manager. Typical activities of daily living were carried out, e.g., eating, sitting, and walking, which did not pose a risk to participants. In both studies, we explained the protocol to participants in detail. Participants were then asked to sign a consent form prior to the recordings. Participants knew they could withdraw their consent and stop the recordings at any time without facing any consequences. No vulnerable populations were involved, as our population consisted of healthy adults only.</p>
</sec>
<sec id="S9">
<title>Author Contributions</title>
<p>FW and OA developed the research question and study designs and wrote the manuscript. Data acquisition and analysis were done by FW and JK, guided by OA. JK commented on the manuscript.</p>
</sec>
<sec id="S10">
<title>Conflict of Interest Statement</title>
<p>The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.</p>
</sec>
</body>
<back>
<sec id="S11">
<title>Funding</title>
<p>This research was supported by the Dutch Technology Foundation STW under grant number 12184.</p>
</sec>
<ref-list>
<title>References</title>
<ref id="B1"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Amft</surname> <given-names>O.</given-names></name> <name><surname>Wahl</surname> <given-names>F.</given-names></name> <name><surname>Ishimaru</surname> <given-names>S.</given-names></name> <name><surname>Kunze</surname> <given-names>K.</given-names></name></person-group> (<year>2015</year>). <article-title>Making regular eyeglasses smart</article-title>. <source>IEEE Pervasive Comput.</source> <volume>14</volume>, <fpage>32</fpage>&#x02013;<lpage>43</lpage>.<pub-id pub-id-type="doi">10.1109/MPRV.2015.60</pub-id></citation></ref>
<ref id="B2"><citation citation-type="web"><collab>ams AG</collab>. (<year>2013</year>). <source>TCS34725</source>. Available at: <uri xlink:href="http://ams.com/eng/Products/Light-Sensors/Color-Sensor/TCS34725">http://ams.com/eng/Products/Light-Sensors/Color-Sensor/TCS34725</uri></citation></ref>
<ref id="B3"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Bababekova</surname> <given-names>Y.</given-names></name> <name><surname>Rosenfield</surname> <given-names>M.</given-names></name> <name><surname>Hue</surname> <given-names>J. E.</given-names></name> <name><surname>Huang</surname> <given-names>R. R.</given-names></name></person-group> (<year>2011</year>). <article-title>Font size and viewing distance of handheld smart phones</article-title>. <source>Optom. Vis. Sci.</source> <volume>88</volume>, <fpage>795</fpage>&#x02013;<lpage>797</lpage>.<pub-id pub-id-type="doi">10.1097/OPX.0b013e3182198792</pub-id><pub-id pub-id-type="pmid">21499163</pub-id></citation></ref>
<ref id="B4"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Bannach</surname> <given-names>D.</given-names></name> <name><surname>Amft</surname> <given-names>O.</given-names></name> <name><surname>Lukowicz</surname> <given-names>P.</given-names></name></person-group> (<year>2008</year>). <article-title>Rapid prototyping of activity recognition applications</article-title>. <source>IEEE Pervasive Comput.</source> <volume>7</volume>, <fpage>22</fpage>&#x02013;<lpage>31</lpage>.<pub-id pub-id-type="doi">10.1109/MPRV.2008.36</pub-id></citation></ref>
<ref id="B5"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Bierman</surname> <given-names>A.</given-names></name> <name><surname>Klein</surname> <given-names>T. R.</given-names></name> <name><surname>Rea</surname> <given-names>M. S.</given-names></name></person-group> (<year>2005</year>). <article-title>The Daysimeter: a device for measuring optical radiation as a stimulus for the human circadian system</article-title>. <source>Meas. Sci. Technol.</source> <volume>16</volume>, <fpage>2292</fpage>&#x02013;<lpage>2299</lpage>.<pub-id pub-id-type="doi">10.1088/0957-0233/16/11/023</pub-id></citation></ref>
<ref id="B6"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Brainard</surname> <given-names>G. C.</given-names></name> <name><surname>Hanifin</surname> <given-names>J. P.</given-names></name> <name><surname>Greeson</surname> <given-names>J. M.</given-names></name> <name><surname>Byrne</surname> <given-names>B.</given-names></name> <name><surname>Glickman</surname> <given-names>G.</given-names></name> <name><surname>Gerner</surname> <given-names>E.</given-names></name> <etal/></person-group> (<year>2001</year>). <article-title>Action spectrum for melatonin regulation in humans: evidence for a novel circadian photoreceptor</article-title>. <source>J. Neurosci.</source> <volume>21</volume>, <fpage>6405</fpage>&#x02013;<lpage>6412</lpage>.<pub-id pub-id-type="pmid">11487664</pub-id></citation></ref>
<ref id="B7"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Cajochen</surname> <given-names>C.</given-names></name> <name><surname>Frey</surname> <given-names>S.</given-names></name> <name><surname>Anders</surname> <given-names>D.</given-names></name> <name><surname>Sp&#x000E4;ti</surname> <given-names>J.</given-names></name> <name><surname>Bues</surname> <given-names>M.</given-names></name> <name><surname>Pross</surname> <given-names>A.</given-names></name> <etal/></person-group> (<year>2011</year>). <article-title>Evening exposure to a light-emitting diodes (LED)-backlit computer screen affects circadian physiology and cognitive performance</article-title>. <source>J. Appl. Physiol.</source> <volume>110</volume>, <fpage>1432</fpage>&#x02013;<lpage>1438</lpage>.<pub-id pub-id-type="doi">10.1152/japplphysiol.00165.2011</pub-id><pub-id pub-id-type="pmid">21415172</pub-id></citation></ref>
<ref id="B8"><citation citation-type="web"><person-group person-group-type="author"><name><surname>Campbell</surname> <given-names>A.</given-names></name></person-group> (<year>2013</year>). <source>How to Hold Your iPad</source>. Available at: <uri xlink:href="https://alastairc.ac/2013/02/how-to-hold-your-ipad/">https://alastairc.ac/2013/02/how-to-hold-your-ipad/</uri></citation></ref>
<ref id="B9"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Chang</surname> <given-names>A.-M.</given-names></name> <name><surname>Aeschbach</surname> <given-names>D.</given-names></name> <name><surname>Duffy</surname> <given-names>J. F.</given-names></name> <name><surname>Czeisler</surname> <given-names>C. A.</given-names></name></person-group> (<year>2015</year>). <article-title>Evening use of light-emitting eReaders negatively affects sleep, circadian timing, and next-morning alertness</article-title>. <source>Proc. Natl. Acad. Sci. U.S.A.</source> <volume>112</volume>, <fpage>1232</fpage>&#x02013;<lpage>1237</lpage>.<pub-id pub-id-type="doi">10.1073/pnas.1418490112</pub-id><pub-id pub-id-type="pmid">25535358</pub-id></citation></ref>
<ref id="B10"><citation citation-type="book"><collab>Deutsche Gesetzliche Unfallversicherung e.V. (DGUV)</collab>. (<year>2015</year>). <source>DGUV Information 215-410 &#x02013; Bildschirm- und B&#x000FC;roarbeitspl&#x000E4;tze Leitfaden f&#x000FC;r die Gestaltung</source>. <publisher-loc>Berlin</publisher-loc>: <publisher-name>Deutsche Gesetzliche Unfallversicherung e.V. (DGUV)</publisher-name>.</citation></ref>
<ref id="B11"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Duffy</surname> <given-names>J. F.</given-names></name> <name><surname>Wright</surname> <given-names>K. P.</given-names></name></person-group> (<year>2005</year>). <article-title>Entrainment of the human circadian system by light</article-title>. <source>J. Biol. Rhythms</source> <volume>20</volume>, <fpage>326</fpage>&#x02013;<lpage>338</lpage>.<pub-id pub-id-type="doi">10.1177/0748730405277983</pub-id></citation></ref>
<ref id="B12"><citation citation-type="web"><collab>eMarketer</collab>. (<year>2016</year>). <source>US Adults Spend 5.5 Hours with Video Content Each Day</source>. Available at: <uri xlink:href="http://www.emarketer.com/Articles/Print.aspx?R=1012362">http://www.emarketer.com/Articles/Print.aspx?R&#x0003D;1012362</uri></citation></ref>
<ref id="B13"><citation citation-type="other"><person-group person-group-type="author"><name><surname>Fawcett</surname> <given-names>T.</given-names></name></person-group> (<year>2004</year>). <source>ROC Graphs: Notes and Practical Considerations for Researchers</source>. Technical Report.</citation></ref>
<ref id="B14"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Heath</surname> <given-names>M.</given-names></name> <name><surname>Sutherland</surname> <given-names>C.</given-names></name> <name><surname>Bartel</surname> <given-names>K.</given-names></name> <name><surname>Gradisar</surname> <given-names>M.</given-names></name> <name><surname>Williamson</surname> <given-names>P.</given-names></name> <name><surname>Lovato</surname> <given-names>N.</given-names></name> <etal/></person-group> (<year>2014</year>). <article-title>Does one hour of bright or short-wavelength filtered tablet screenlight have a meaningful effect on adolescents&#x02019; pre-bedtime alertness, sleep, and daytime functioning?</article-title> <source>Chronobiol. Int.</source> <volume>31</volume>, <fpage>496</fpage>&#x02013;<lpage>505</lpage>.<pub-id pub-id-type="doi">10.3109/07420528.2013.872121</pub-id></citation></ref>
<ref id="B15"><citation citation-type="other"><collab>Institut f&#x000FC;r Demoskopie Allensbach</collab>. (<year>2014</year>). <source>Brillenstudie 2014</source>. Technical Report.</citation></ref>
<ref id="B16"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Jaramillo-Garcia</surname> <given-names>P.</given-names></name> <name><surname>Lopera Gonzalez</surname> <given-names>L. I.</given-names></name> <name><surname>Amft</surname> <given-names>O.</given-names></name></person-group> (<year>2014</year>). <article-title>Using implicit user feedback to balance energy consumption and user comfort of proximity-controlled computer screens</article-title>. <source>J. Ambient Intell. Hum. Comput.</source> <volume>6</volume>, <fpage>207</fpage>&#x02013;<lpage>221</lpage>.<pub-id pub-id-type="doi">10.1007/s12652-014-0222-2</pub-id></citation></ref>
<ref id="B17"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Kantermann</surname> <given-names>T.</given-names></name> <name><surname>Roenneberg</surname> <given-names>T.</given-names></name></person-group> (<year>2009</year>). <article-title>Is light-at-night a health risk factor or a health risk predictor?</article-title> <source>Chronobiol. Int.</source> <volume>26</volume>, <fpage>1069</fpage>&#x02013;<lpage>1074</lpage>.<pub-id pub-id-type="doi">10.3109/07420520903223984</pub-id><pub-id pub-id-type="pmid">19731106</pub-id></citation></ref>
<ref id="B18"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Lee</surname> <given-names>D.-S.</given-names></name></person-group> (<year>2012</year>). <article-title>Preferred viewing distance of liquid crystal high-definition television</article-title>. <source>Appl. Ergon.</source> <volume>43</volume>, <fpage>151</fpage>&#x02013;<lpage>156</lpage>.<pub-id pub-id-type="doi">10.1016/j.apergo.2011.04.007</pub-id><pub-id pub-id-type="pmid">21529771</pub-id></citation></ref>
<ref id="B19"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Nathan</surname> <given-names>J. G.</given-names></name> <name><surname>Anderson</surname> <given-names>D. R.</given-names></name> <name><surname>Field</surname> <given-names>D. E.</given-names></name> <name><surname>Collins</surname> <given-names>P.</given-names></name></person-group> (<year>1985</year>). <article-title>Television viewing at home: distances and visual angles of children and adults</article-title>. <source>Hum. Factors J. Hum. Factors Ergon. Soc.</source> <volume>27</volume>, <fpage>467</fpage>&#x02013;<lpage>476</lpage>.</citation></ref>
<ref id="B20"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Peng</surname> <given-names>H.</given-names></name> <name><surname>Long</surname> <given-names>F.</given-names></name> <name><surname>Ding</surname> <given-names>C.</given-names></name></person-group> (<year>2005</year>). <article-title>Feature selection based on mutual information criteria of max-dependency, max-relevance, and min-redundancy</article-title>. <source>IEEE Trans. Pattern Anal. Mach. Intell.</source> <volume>27</volume>, <fpage>1226</fpage>&#x02013;<lpage>1238</lpage>.<pub-id pub-id-type="doi">10.1109/TPAMI.2005.159</pub-id><pub-id pub-id-type="pmid">16119262</pub-id></citation></ref>
<ref id="B21"><citation citation-type="web"><collab>PresseBox</collab>. (<year>2008</year>). <source>Deutsche sitzen lange vor dem Computer &#x02013; BITKOM &#x02013; Bundesverband Informationswirtschaft, Telekommunikation und neue Medien e.V. &#x02013; Pressemitteilung</source>. Available at: <uri xlink:href="http://www.pressebox.de/?boxid=200275">http://www.pressebox.de/?boxid&#x0003D;200275</uri></citation></ref>
<ref id="B22"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Revell</surname> <given-names>V. L.</given-names></name> <name><surname>Eastman</surname> <given-names>C. I.</given-names></name></person-group> (<year>2005</year>). <article-title>How to trick mother nature into letting you fly around or stay up all night</article-title>. <source>J. Biol. Rhythms</source> <volume>20</volume>, <fpage>353</fpage>&#x02013;<lpage>365</lpage>.<pub-id pub-id-type="doi">10.1177/0748730405277233</pub-id><pub-id pub-id-type="pmid">16077154</pub-id></citation></ref>
<ref id="B23"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Shieh</surname> <given-names>K.-K.</given-names></name> <name><surname>Lee</surname> <given-names>D.-S.</given-names></name></person-group> (<year>2007</year>). <article-title>Preferred viewing distance and screen angle of electronic paper displays</article-title>. <source>Appl. Ergon.</source> <volume>38</volume>, <fpage>601</fpage>&#x02013;<lpage>608</lpage>.<pub-id pub-id-type="doi">10.1016/j.apergo.2006.06.008</pub-id><pub-id pub-id-type="pmid">17049333</pub-id></citation></ref>
<ref id="B24"><citation citation-type="confproc"><person-group person-group-type="author"><name><surname>Spina</surname> <given-names>G.</given-names></name> <name><surname>Roberts</surname> <given-names>F.</given-names></name> <name><surname>Weppner</surname> <given-names>J.</given-names></name> <name><surname>Lukowicz</surname> <given-names>P.</given-names></name> <name><surname>Amft</surname> <given-names>O.</given-names></name></person-group> (<year>2013</year>). &#x0201C;<article-title>CRNTC&#x0002B;: a smartphone-based sensor processing framework for prototyping personal healthcare applications</article-title>,&#x0201D; in <conf-name>PH 2013: Proceedings of the 7th International Conference on Pervasive Computing Technologies for Healthcare</conf-name> (<conf-loc>Venice</conf-loc>: <conf-sponsor>IEEE</conf-sponsor>), <fpage>252</fpage>&#x02013;<lpage>255</lpage>.</citation></ref>
<ref id="B25"><citation citation-type="other"><collab>The Vision Council of America</collab>. (<year>2016</year>). <source>Digital Eye Strain Report 2016</source>. Technical Report.</citation></ref>
<ref id="B26"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>van der Lely</surname> <given-names>S.</given-names></name> <name><surname>Frey</surname> <given-names>S.</given-names></name> <name><surname>Garbazza</surname> <given-names>C.</given-names></name> <name><surname>Wirz-Justice</surname> <given-names>A.</given-names></name> <name><surname>Jenni</surname> <given-names>O. G.</given-names></name> <name><surname>Steiner</surname> <given-names>R.</given-names></name> <etal/></person-group> (<year>2015</year>). <article-title>Blue blocker glasses as a countermeasure for alerting effects of evening light-emitting diode screen exposure in male teenagers</article-title>. <source>J. Adolesc. Health</source> <volume>56</volume>, <fpage>113</fpage>&#x02013;<lpage>119</lpage>.<pub-id pub-id-type="doi">10.1016/j.jadohealth.2014.08.002</pub-id><pub-id pub-id-type="pmid">25287985</pub-id></citation></ref>
<ref id="B27"><citation citation-type="confproc"><person-group person-group-type="author"><name><surname>Wahl</surname> <given-names>F.</given-names></name> <name><surname>Freund</surname> <given-names>M.</given-names></name> <name><surname>Amft</surname> <given-names>O.</given-names></name></person-group> (<year>2015a</year>). &#x0201C;<article-title>Using smart eyeglasses as a wearable game controller</article-title>,&#x0201D; in <conf-name>Ubicomp 2015: Adjunct Proceedings of the 2015 ACM Conference on Pervasive and Ubiquitous Computing</conf-name> (<conf-loc>Osaka</conf-loc>: <conf-sponsor>ACM</conf-sponsor>), <fpage>377</fpage>&#x02013;<lpage>380</lpage>.</citation></ref>
<ref id="B28"><citation citation-type="confproc"><person-group person-group-type="author"><name><surname>Wahl</surname> <given-names>F.</given-names></name> <name><surname>Freund</surname> <given-names>M.</given-names></name> <name><surname>Amft</surname> <given-names>O.</given-names></name></person-group> (<year>2015b</year>). &#x0201C;<article-title>WISEglass: multi-purpose context-aware smart eyeglasses</article-title>,&#x0201D; in <conf-name>Proceedings of the 2015 ACM International Symposium on Wearable Computers</conf-name> (<conf-loc>Osaka</conf-loc>: <conf-sponsor>ACM</conf-sponsor>), <fpage>159</fpage>&#x02013;<lpage>160</lpage>.</citation></ref>
<ref id="B29"><citation citation-type="confproc"><person-group person-group-type="author"><name><surname>Wahl</surname> <given-names>F.</given-names></name> <name><surname>Freund</surname> <given-names>M.</given-names></name> <name><surname>Amft</surname> <given-names>O.</given-names></name></person-group> (<year>2015c</year>). &#x0201C;<article-title>WISEglass: smart eyeglasses recognising context</article-title>,&#x0201D; in <conf-name>Bodynets 2015: Proceedings of the International Conference on Body Area Networks</conf-name> (<conf-loc>Sydney</conf-loc>: <conf-sponsor>ACM</conf-sponsor>), <fpage>11</fpage>&#x02013;<lpage>17</lpage>.</citation></ref>
<ref id="B30"><citation citation-type="confproc"><person-group person-group-type="author"><name><surname>Wahl</surname> <given-names>F.</given-names></name> <name><surname>Kantermann</surname> <given-names>T.</given-names></name> <name><surname>Amft</surname> <given-names>O.</given-names></name></person-group> (<year>2014</year>). &#x0201C;<article-title>How much light do you get? Estimating daily light exposure using smartphones</article-title>,&#x0201D; in <conf-name>ISWC 2014: Proceedings of the 2014 ACM International Symposium on Wearable Computers</conf-name> (<conf-loc>Seattle</conf-loc>: <conf-sponsor>ACM</conf-sponsor>), <fpage>43</fpage>&#x02013;<lpage>46</lpage>.</citation></ref>
<ref id="B31"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Wood</surname> <given-names>B.</given-names></name> <name><surname>Rea</surname> <given-names>M. S.</given-names></name> <name><surname>Plitnick</surname> <given-names>B.</given-names></name> <name><surname>Figueiro</surname> <given-names>M. G.</given-names></name></person-group> (<year>2013</year>). <article-title>Light level and duration of exposure determine the impact of self-luminous tablets on melatonin suppression</article-title>. <source>Appl. Ergon.</source> <volume>44</volume>, <fpage>237</fpage>&#x02013;<lpage>240</lpage>.<pub-id pub-id-type="doi">10.1016/j.apergo.2012.07.008</pub-id><pub-id pub-id-type="pmid">22850476</pub-id></citation></ref>
<ref id="B32"><citation citation-type="confproc"><person-group person-group-type="author"><name><surname>Zhang</surname> <given-names>R.</given-names></name> <name><surname>Bernhart</surname> <given-names>S.</given-names></name> <name><surname>Amft</surname> <given-names>O.</given-names></name></person-group> (<year>2016</year>). &#x0201C;<article-title>Diet eyeglasses: recognising food chewing using EMG and smart eyeglasses</article-title>,&#x0201D; in <conf-name>Proceedings of the International Conference on Wearable and Implantable Body Sensor Networks (BSN&#x02019; 16)</conf-name> (<conf-loc>San Francisco</conf-loc>: <conf-sponsor>IEEE</conf-sponsor>), <fpage>7</fpage>&#x02013;<lpage>12</lpage>.</citation></ref>
</ref-list>
<fn-group>
<fn id="fn1"><p><sup>1</sup><uri xlink:href="http://www.youtube.com/watch?v=z4UDNzXD3qA">http://www.youtube.com/watch?v&#x0003D;z4UDNzXD3qA</uri>.</p></fn>
<fn id="fn2"><p><sup>2</sup><uri xlink:href="http://www.youtube.com/watch?v=HaS4NXxL5Qc">http://www.youtube.com/watch?v&#x0003D;HaS4NXxL5Qc</uri>.</p></fn>
</fn-group>
</back>
</article>