<?xml version="1.0" encoding="utf-8"?>
<!DOCTYPE article PUBLIC "-//NLM//DTD Journal Publishing DTD v2.3 20070202//EN" "journalpublishing.dtd">
<?covid-19-tdm?>
<article xmlns:mml="http://www.w3.org/1998/Math/MathML" xmlns:xlink="http://www.w3.org/1999/xlink" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" article-type="brief-report" dtd-version="2.3">
<front>
<journal-meta>
<journal-id journal-id-type="publisher-id">Front. Psychol.</journal-id>
<journal-title>Frontiers in Psychology</journal-title>
<abbrev-journal-title abbrev-type="pubmed">Front. Psychol.</abbrev-journal-title>
<issn pub-type="epub">1664-1078</issn>
<publisher>
<publisher-name>Frontiers Media S.A.</publisher-name>
</publisher>
</journal-meta>
<article-meta>
<article-id pub-id-type="doi">10.3389/fpsyg.2021.661613</article-id>
<article-categories>
<subj-group subj-group-type="heading">
<subject>Psychology</subject>
<subj-group>
<subject>Perspective</subject>
</subj-group>
</subj-group>
</article-categories>
<title-group>
<article-title>Flow Immersive: A Multiuser, Multidimensional, Multiplatform Interactive Covid-19 Data Visualization Tool</article-title>
</title-group>
<contrib-group>
<contrib contrib-type="author" corresp="yes">
<name>
<surname>DiBenigno</surname>
<given-names>Michael</given-names>
</name>
<xref rid="aff1" ref-type="aff"><sup>1</sup></xref>
<xref rid="c001" ref-type="corresp"><sup>&#x002A;</sup></xref>
<uri xlink:href="https://loop.frontiersin.org/people/1299146/overview"/>
</contrib>
<contrib contrib-type="author" corresp="yes">
<name>
<surname>Kosa</surname>
<given-names>Mehmet</given-names>
</name>
<xref rid="aff2" ref-type="aff"><sup>2</sup></xref>
<xref rid="c002" ref-type="corresp"><sup>&#x002A;</sup></xref>
<uri xlink:href="https://loop.frontiersin.org/people/1216943/overview"/>
</contrib>
<contrib contrib-type="author" corresp="yes">
<name>
<surname>Johnson-Glenberg</surname>
<given-names>Mina C.</given-names>
</name>
<xref rid="aff2" ref-type="aff"><sup>2</sup></xref>
<xref rid="c003" ref-type="corresp"><sup>&#x002A;</sup></xref>
<uri xlink:href="https://loop.frontiersin.org/people/305341/overview"/>
</contrib>
</contrib-group>
<aff id="aff1"><sup>1</sup><institution>Flow Immersive</institution>, <addr-line>Mountain View, CA</addr-line>, <country>United States</country>
</aff>
<aff id="aff2"><sup>2</sup><institution>Department of Psychology, Arizona State University</institution>, <addr-line>Tempe, AZ</addr-line>, <country>United States</country>
</aff>
<author-notes>
<fn id="fn1" fn-type="edited-by">
<p>Edited by: Kostas Karpouzis, Institute of Communication and Computer Systems, Greece</p></fn>
<fn id="fn2" fn-type="edited-by">
<p>Reviewed by: Dusanka Boskovic, University of Sarajevo, Bosnia and Herzegovina; Eleni Mangina, University College Dublin, Ireland; Youngho Lee, Mokpo National University, South Korea</p></fn>
<corresp id="c001">&#x002A;Correspondence: Michael DiBenigno, <email>michael@flow.gl</email></corresp>
<corresp id="c002">Mehmet Kosa, <email>mkosa@asu.edu</email></corresp>
<corresp id="c003">Mina C. Johnson-Glenberg, <email>mina.johnson@asu.edu</email></corresp>
<fn id="fn3" fn-type="other">
<p>This article was submitted to Human-Media Interaction, a section of the journal Frontiers in Psychology</p></fn>
</author-notes>
<pub-date pub-type="epub">
<day>13</day>
<month>05</month>
<year>2021</year>
</pub-date>
<pub-date pub-type="collection">
<year>2021</year>
</pub-date>
<volume>12</volume>
<elocation-id>661613</elocation-id>
<history>
<date date-type="received">
<day>31</day>
<month>01</month>
<year>2021</year>
</date>
<date date-type="accepted">
<day>22</day>
<month>04</month>
<year>2021</year>
</date>
</history>
<permissions>
<copyright-statement>Copyright &#x00A9; 2021 DiBenigno, Kosa and Johnson-Glenberg.</copyright-statement>
<copyright-year>2021</copyright-year>
<copyright-holder>DiBenigno, Kosa and Johnson-Glenberg</copyright-holder>
<license xlink:href="http://creativecommons.org/licenses/by/4.0/">
<p>This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.</p>
</license>
</permissions>
<abstract>
<p>Covid-19 has prompted a surge of data visualizations published for public consumption, yet many have lacked broad appeal or have not been well understood by laypeople. A data storytelling platform called Flow Immersive was created to engage both laypeople and experts in understanding complex information. The tool integrates emerging technologies [e.g., augmented reality (AR) and virtual reality (VR)] with a multiplatform, multiuser publishing approach. From October 2020 to December 2020, Flow&#x2019;s Covid-19 AR videos captured nine million views and have been used in multiple professional presentations. This paper documents the journey from development to deployment, along with user feedback, all of which led to breakthroughs in scalability and higher levels of engagement.</p>
</abstract>
<kwd-group>
<kwd>data simulation</kwd>
<kwd>data visualization</kwd>
<kwd>Covid-19</kwd>
<kwd>augmented reality</kwd>
<kwd>virtual reality</kwd>
<kwd>spatial design</kwd>
</kwd-group>
<counts>
<fig-count count="7"/>
<table-count count="1"/>
<equation-count count="0"/>
<ref-count count="37"/>
<page-count count="9"/>
<word-count count="4612"/>
</counts>
</article-meta>
</front>
<body>
<sec id="sec1" sec-type="intro">
<title>Introduction</title>
<p>Design is the thoughtful organization of resources to accomplish a goal. It is both a process (i.e., set of activities) and a product (<xref ref-type="bibr" rid="ref17">Hevner et al., 2004</xref>). During design and iteration, the search is about finding satisfactory solutions, i.e., satisficing without explicitly specifying all possible solutions (<xref ref-type="bibr" rid="ref32">Simon, 1996</xref>). The design of data visualization has multiple solutions. We find it an intriguing and important space to work in, because data visualization literacy is a crucial skill for an educated populace (<xref ref-type="bibr" rid="ref5">B&#x00F6;rner et al., 2019</xref>). Visual and spatial presentation of health data is gaining prominence in general (<xref ref-type="bibr" rid="ref8">Cical&#x00F2; and Valentino, 2019</xref>), and it seems more timely than ever due to the Covid-19 pandemic because large segments of society that never thought much about virus transmission now want to understand how individuals and communities can best mitigate the pandemic&#x2019;s toll. As people better understand data visualizations related to Covid-19, they may positively change their behaviors. In this paper, we describe a tool that visualizes up-to-date Covid-19 data and makes the three-dimensional (3D) information accessible for the public. We describe the multitude of ways Flow Immersive visualizations (called <italic>Flows</italic>) have been presented, and the impact of these various presentations.</p>
<p>While 3D is one feature, other design considerations shaped the development of the visualization tool. One is utilizing scale to show the &#x201C;zoomed out&#x201D; big-picture view, and then allowing users to &#x201C;zoom in&#x201D; to the details. One can see the forest <italic>and</italic> the trees. Flow was designed so that data points do not disappear and reappear but instead move from one perspective, or graph, to another. Lastly, the use of steps allows participants to follow a linear storyline. At each step, participants can click on dots (data points) and pull up additional details that aid in understanding a specific attribute. These features and design decisions make the tool highly interactive and profoundly extensible.</p>
<p>As of this writing (February, 2021), the world is still inundated with daily, new content aimed at informing the general public about Covid-19-related topics. These visualizations can miss their mark, sometimes being confusing or misunderstood by the layperson (see <xref ref-type="bibr" rid="ref30">Romano et al., 2020</xref> for a comparison of linear and log scales). Many visualizations use traditional widgets such as two-dimensional (2D) bar and pie charts (which notoriously misrepresent area, see <xref ref-type="bibr" rid="ref9">Cleveland, 1987</xref>). We have found several that are limited to certain regions, sometimes to a single state, city, or county (<xref ref-type="bibr" rid="ref2">Arneson et al., 2020</xref>; <xref ref-type="bibr" rid="ref21">Kaul et al., 2020</xref>; <xref ref-type="bibr" rid="ref27">Mondal et al., 2020</xref>; <xref ref-type="bibr" rid="ref34">Teb&#x00E9; et al., 2020</xref>; <xref ref-type="bibr" rid="ref39">Zuo et al., 2020</xref>). The goal of the 3D data storytelling tool, Flow Immersive, was to create an extensible, generalizable Covid-19 visualization interface that is not location specific and is accessible to multiple users across multiple devices and platforms.</p>
</sec>
<sec id="sec2">
<title>Adding a Third Dimension</title>
<p>Three-dimensional data visualization tools have been implemented for augmented reality (AR) and virtual reality (VR) in several domains such as civil engineering, industrial engineering, construction, and science in general (<xref ref-type="bibr" rid="ref7">Bryson, 1996</xref>; <xref ref-type="bibr" rid="ref6">Bruno et al., 2006</xref>; <xref ref-type="bibr" rid="ref31">Schall et al., 2009</xref>; <xref ref-type="bibr" rid="ref13">Donalek et al., 2014</xref>). Similarly, the engaging immersivity of AR and VR has increased learning in topics as varied as history (<xref ref-type="bibr" rid="ref4">Blazauskas et al., 2017</xref>), medicine (<xref ref-type="bibr" rid="ref29">Pennefather and Krebs, 2019</xref>), biology (<xref ref-type="bibr" rid="ref24">Lee et al., 2010</xref>), and natural selection (<xref ref-type="bibr" rid="ref20">Johnson-Glenberg et al., 2020</xref>). Several studies report that greater learning gains occur when content is learned in 3D VR compared to 2D desktop versions (<xref ref-type="bibr" rid="ref1">Allcoat and von M&#x00FC;hlenen, 2018</xref>; <xref ref-type="bibr" rid="ref16">Greenwald et al., 2018</xref>; <xref ref-type="bibr" rid="ref22">Krokos et al., 2019</xref>; <xref ref-type="bibr" rid="ref20">Johnson-Glenberg et al., 2020</xref>; <xref ref-type="bibr" rid="ref38">Wu et al., 2020</xref>). Additionally, visualizing data in AR and VR can increase user collaboration (<xref ref-type="bibr" rid="ref26">Meiguins et al., 2006</xref>) and aid in eliminating the fish tank problem (looking at the data behind a glass; <xref ref-type="bibr" rid="ref36">Thomas et al., 2014</xref>). Providing immersive experiences consequently increases the understanding of the data (<xref ref-type="bibr" rid="ref3">Bayyari and Tudoreanu, 2006</xref>). Adding a third dimension spatially reveals critical information that is otherwise lost with flat choropleths that use color bin ranges. 
For example, in a 2D map of the United States showing votes cast by county (see <xref rid="fig1" ref-type="fig">Figure 1</xref>, left panel), it would appear that the country voted predominantly red (Republican). However, land does not vote. When a third dimension, a height axis, is added, it reveals the number of votes in an area, and it becomes evident that large cities are primarily blue (<xref rid="fig1" ref-type="fig">Figure 1</xref>, right panel). It is the actions of the inhabitants that drive the total countrywide count.</p>
<fig position="float" id="fig1">
<label>Figure 1</label>
<caption>
<p>
<bold>Panel on the left</bold> is two-dimensional (2D) and makes the United States appear to have voted mainly Republican (red) in 2020; the <bold>panel on the right</bold> adds a third dimension (number of voters), highlighting the difference between rural and city voting patterns, seen as blue spikes over cities. This Flow story was shown soon after the November 2020 election to spur discussion of the difference in voting patterns between rural and urban areas.</p>
</caption>
<graphic xlink:href="fpsyg-12-661613-g001.tif"/>
</fig>
<p>Immersion can be linked to the number of dimensions available to a viewer (<xref ref-type="bibr" rid="ref33">Slater, 2003</xref>). We predict that the more immersed people are, the more motivated and engaged they will be (<xref ref-type="bibr" rid="ref25">Makransky et al., 2019</xref>), and the better they should comprehend the Covid-19 data. Therefore, another aim in our design was to make the tool usable on the emerging and highly immersive technologies of AR and VR, both on mobile devices and head-mounted displays (HMDs).</p>
</sec>
<sec id="sec3">
<title>The Tool: Flow Immersive</title>
<p>Flow Immersive, founded in 2016, is a data visualization design tool that enables users to communicate complex information and data stories more effectively to both expert and non-expert audiences.<xref rid="fn0001" ref-type="fn"><sup>1</sup></xref> The tool lets users tell data-driven narratives in an engaging, interactive, and understandable way, and it can create a diverse set of data simulations (<xref ref-type="bibr" rid="ref14">Flow Immersive, 2020</xref>).<xref ref-type="fn" rid="fn0002"><sup>2</sup></xref>
</p>
<p>To date, Flow Immersive has created and published 17 interactive, 3D, Covid-19 data simulations that work across the following platforms: computer screens, mobile phones, VR headsets, AR mobile, and AR headsets. These simulations, called <italic>Flows</italic>, can be published as videos, images, GIFs, standalone visualizations, or immersive multiuser experiences. Additionally, they run natively in a web browser (i.e., no separate application download is needed, because <italic>Flow</italic> is deployed <italic>via</italic> WebXR).</p>
<p>In the 11 months since we began publishing Covid-19 data simulations, many design decisions were made to balance reach (i.e., number of views) against engagement (i.e., active time spent). Here, we chronicle some of that journey from the simulations&#x2019; introduction in late January 2020 to their usage as of early January 2021. <xref rid="fig2" ref-type="fig">Figure 2</xref> showcases a story about how different states in the United States have different predicted peak hospitalization dates. Although data dashboards provide ample detail, <italic>Flow&#x2019;s</italic> immersive and interactive data stories may be a better method for guiding non-experts to the insights in the data. Here, we explore the first signs of the curve flattening in the United States, though history forewarned us with the 1918 Spanish Flu that the second and third waves caused more deaths than the first wave (not pictured here). The figure shows predictive models from the Institute for Health Metrics and Evaluation (IHME), highlighting predicted peak hospital bed utilizations, which differ by state.</p>
<fig position="float" id="fig2">
<label>Figure 2</label>
<caption>
<p>Example of a predictive simulation of augmented reality (AR) Covid-19 data for multiple states in the United States. The <italic>X</italic>-axis is time, the month of March is visible by the presenter&#x2019;s leg; the <italic>Y</italic>-axis is number of new cases based on previous 3 days, and the <italic>Z</italic>-axis is actually a series of lines representing multiple individual states in the United States.</p>
</caption>
<graphic xlink:href="fpsyg-12-661613-g002.tif"/>
</fig>
<p>In general, we saw that as the presentation format becomes more immersive (moving into augmented and virtual reality platforms), the reach (i.e., number of views) decreases while engagement (i.e., duration of the view) increases. Conversely, as the format becomes simpler and more accessible, the reach of the visualization increases while engagement decreases. As an example, one of our early visualizations was made into an animated GIF and viewed over 310,000 times (<xref ref-type="bibr" rid="ref18">Infinitemoment22, 2020</xref>), whereas the same visualization with user-controlled interactive capabilities on the website was viewed only 12,000 times. The numbers continue to drop as the platforms become less ubiquitous; around 300 people viewed it in VR headsets, and fewer than 20 viewed it on AR headsets (the most commonly used AR platform was the <italic>Magic Leap One</italic>). Although the GIF lasted only 1 min, those in-headset spent an average of around 3 min exploring the content. The difference in reach is illustrated in <xref rid="fig3" ref-type="fig">Figure 3</xref>.</p>
<fig position="float" id="fig3">
<label>Figure 3</label>
<caption>
<p>User reach in different formats for one Flow Immersive visualization.</p>
</caption>
<graphic xlink:href="fpsyg-12-661613-g003.tif"/>
</fig>
</sec>
<sec id="sec4">
<title>Usage in January: Beginning of the Covid-19 Pandemic</title>
<p>The GIF and interactive data visualization showing the spread of the disease over time appeared to be impactful. However, <italic>via</italic> viewer feedback, it became clear that the narrative, rather than the visualization itself, is what really drives viewer engagement. The decision was made to rapidly record an AR session and &#x201C;walk the viewers through&#x201D; the content with speech and the storyteller&#x2019;s gestures. The vast majority of responses were positive, and the number of views on <italic>LinkedIn</italic> quickly reached 2,000 without any advertising (<xref ref-type="bibr" rid="ref10">DiBenigno, 2020a</xref>; <xref rid="fig4" ref-type="fig">Figure 4</xref>). <xref rid="fig4" ref-type="fig">Figure 4</xref> shows that the relative impact of Covid-19 was at that time concentrated in China and Italy. Several viewers commented that the rate at which Italy was increasing was &#x201C;shocking.&#x201D; Such responses would be more difficult to elicit with a 2D heat map, especially for those who are colorblind.</p>
<fig position="float" id="fig4">
<label>Figure 4</label>
<caption>
<p>A snapshot from the AR presentation showing Covid-19 early March 2020 cases geospatially on an AR map. Height of spike corresponds to confirmed cases.</p>
</caption>
<graphic xlink:href="fpsyg-12-661613-g004.tif"/>
</fig>
<p>Ten additional <italic>Flows</italic> were then created in this same spirit, averaging approximately 3 min in length. Videos were captured and shared <italic>via</italic> AR on a smartphone.</p>
</sec>
<sec id="sec6">
<title>The Amazing Reach of <italic>TikTok</italic></title>
<p><italic>TikTok</italic> is a social media platform driven by algorithms that prioritize content virality over an established following; content can therefore reach a very large audience without a solidified base of viewers. This &#x201C;viral&#x201D;-focused platform enabled Flow Immersive to reach a much larger audience. One example of how a Flow extended beyond its previous presentations is a <italic>TikTok</italic> video that addresses the disease propagation of Covid-19 (<xref ref-type="bibr" rid="ref11">DiBenigno, 2020b</xref>). The video received over 2 million views in the span of 7 days (<xref rid="fig5" ref-type="fig">Figure 5</xref>). It explains that, in fact, 80&#x2013;90% of Covid-19 infections are caused by 10&#x2013;20% of people. The accompanying network graph allowed us to visualize the difference between a mathematical model of disease in which each node spawns two additional nodes vs. a network graph in which only 10% of the nodes were responsible for 80&#x2013;90% of the spread, painting a much different picture.</p>
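The contrast drawn in the video (uniform spread vs. super-spreader-driven spread) can be sketched as a toy branching simulation. This is a hypothetical illustration only, not Flow's actual model; the function names and parameters are our own.

```python
import random

def offspring_counts(n, mode):
    # Toy model (ours, for illustration): each of n infectors transmits to
    # some number of new people.  "uniform" gives every infector exactly
    # R0 = 2 secondary cases; "overdispersed" concentrates transmission in
    # roughly 10% of infectors (super-spreaders) while keeping the mean at 2.
    if mode == "uniform":
        return [2] * n
    return [random.choice([20] + [0] * 9) for _ in range(n)]

def share_from_top_decile(counts):
    # Fraction of all secondary infections caused by the top 10% of infectors.
    ranked = sorted(counts, reverse=True)
    top = ranked[: max(1, len(ranked) // 10)]
    total = sum(ranked)
    return sum(top) / total if total else 0.0

if __name__ == "__main__":
    random.seed(1)
    print(share_from_top_decile(offspring_counts(1000, "uniform")))
    print(share_from_top_decile(offspring_counts(1000, "overdispersed")))
```

In the uniform model the top decile accounts for exactly 10% of transmissions, whereas in the overdispersed model it accounts for nearly all of them, which is the qualitative picture the network graph conveys.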
<fig position="float" id="fig5">
<label>Figure 5</label>
<caption>
<p>A snapshot from the <italic>TikTok</italic> animation illustrating how super-spreader events were driving Covid-19 infections. The two blue nodes represent the first two cases.</p>
</caption>
<graphic xlink:href="fpsyg-12-661613-g005.tif"/>
</fig>
<p>Soon thereafter, our disease transmission video was presented in a large flat-screen format by William McKeon, President and CEO of the Texas Medical Center. That <italic>Flow</italic> continues to live on the platform as an interactive visualization (<xref ref-type="bibr" rid="ref35">Texas Medical Center, 2020</xref>; <xref rid="fig6" ref-type="fig">Figure 6</xref>). The Texas Medical Center story was similar: the presenter walked viewers through how a virus with a certain basic reproduction number (R naught) can propagate through a population. He chose this as one of the best illustrations of how disease spreads and why it is important to stop super-spreader events.</p>
<fig position="float" id="fig6">
<label>Figure 6</label>
<caption>
<p>A YouTube snapshot from the Texas Medical Center&#x2019;s &#x201C;State of the TMC&#x201D; event.</p>
</caption>
<graphic xlink:href="fpsyg-12-661613-g006.tif"/>
</fig>
</sec>
<sec id="sec27">
<title>Flow Immersive Usage: End of 2020</title>
<p>With the exponential reach of our AR videos <italic>via TikTok</italic>, we felt more empowered to explore content on XR devices. Instead of conceiving of <italic>Flow</italic> as one specific format, we designed it as a funnel to draw people into more and more immersive experiences. Users reported that the system was intuitive and helpful. Over the course of the year, <italic>Flows</italic> have been viewed over 9 million times.<xref rid="fn0003" ref-type="fn"><sup>3</sup></xref> Below is an unsolicited, posted anecdote from a user in an AR headset (<xref ref-type="bibr" rid="ref23">Lang, 2020</xref>):</p>
<disp-quote>
<p>
<italic>&#x201C;&#x2026; Using a spatial visualization like this and stepping through day by day supplements our intuitive perception and makes the difference obvious&#x2026;&#x201D;</italic>
</p>
</disp-quote>
<p>The team at Flow Immersive is currently refining the web-based editor that allows anyone to build their own data story. There is growing interest, and an expanding waitlist, for the future Flow Immersive Editor, which lets anyone without coding skills enter data and begin creating a &#x201C;story.&#x201D; The team is ramping up support for this feature. <xref rid="fig7" ref-type="fig">Figure 7</xref> shows a screenshot of the Editor.</p>
<fig position="float" id="fig7">
<label>Figure 7</label>
<caption>
<p>Flow Immersive Web-Based Editor. This is how all of the visualizations and interactive experiences are created.</p>
</caption>
<graphic xlink:href="fpsyg-12-661613-g007.tif"/>
</fig>
<p>The largest takeaway from this endeavor is that there is a strong desire in the population (which includes a non-technical audience) to engage with, present, and really understand data. Epidemiologists often have the right information, but if it is not effectively communicated, it loses public health value. <italic>Flow</italic> is a pioneering method to increase engagement around 3D data visualizations and personalize presentations. We acknowledge that visualizing data in AR/VR has its own challenges, such as screen limitations and tailored interface design (<xref ref-type="bibr" rid="ref28">Olshannikova et al., 2015</xref>), and we aim to overcome some of those in our future work.</p>
</sec>
<sec id="sec7">
<title>Qualitative User Feedback</title>
<p>To better understand <italic>Flow&#x2019;s</italic> uses, a survey consisting of open-ended questions was sent to 40 people in March 2020. A total of 16 consented to evaluate the tool. One participant stated that they had not experienced <italic>Flow</italic> with Covid-19 content and was therefore excluded from the analysis below. Of the remaining 15, nine stated that they had specific Covid-19 questions they wanted answered within <italic>Flow</italic>. Among those with prior Covid-19 questions in mind, the most common were: comparison by state (latest data and over time), geographical vaccine distribution, cities with high infection rates, comparison with other countries, and the results of hypothetical cases in which stricter measures were taken. <xref rid="tab1" ref-type="table">Table 1</xref> highlights some of the results.</p>
<table-wrap position="float" id="tab1">
<label>Table 1</label>
<caption>
<p>Survey information of interest from 15 responders.</p>
</caption>
<table frame="hsides" rules="groups">
<tbody>
<tr>
<td align="left" valign="top" rowspan="2">Type of experience</td>
<td align="left" valign="top">Longer form video</td>
<td align="center" valign="top">12</td>
</tr>
<tr>
<td align="left" valign="top">Shorter form video</td>
<td align="center" valign="top">11</td>
</tr>
<tr>
<td align="left" valign="top" rowspan="4">Type of platform</td>
<td align="left" valign="top">Phone</td>
<td align="center" valign="top">6</td>
</tr>
<tr>
<td align="left" valign="top">Desktop</td>
<td align="center" valign="top">7</td>
</tr>
<tr>
<td align="left" valign="top">VR</td>
<td align="center" valign="top">2</td>
</tr>
<tr>
<td align="left" valign="top">AR</td>
<td align="center" valign="top">2</td>
</tr>
<tr>
<td align="left" valign="top" rowspan="2">Specific Covid-19 related questions?</td>
<td align="left" valign="top">Yes</td>
<td align="center" valign="top">9</td>
</tr>
<tr>
<td align="left" valign="top">No</td>
<td align="center" valign="top">6</td>
</tr>
</tbody>
</table>
</table-wrap>
<p>When queried how <italic>Flow</italic> is different from other, more traditional and non-interactive visualizations, two clusters of answers emerged: (1) quality of interaction and (2) learning. Users found <italic>Flow</italic> to be a rich experience thanks to the storytelling aspects with personalized experiences. Some illustrative replies are listed below:</p>
<list list-type="order">
<list-item>
<p>1. Quality of Interaction:</p>
<list list-type="simple">
<list-item>
<p>&#x201C;Made me watch it longer than I would have, more memorable.&#x201D;</p>
</list-item>
<list-item>
<p>&#x201C;A <italic>Flow</italic> chart immerses you in the content!&#x201D;</p>
</list-item>
<list-item>
<p>&#x201C;It was nice to see the graphs in such an immersive way.&#x201D;</p>
</list-item>
</list>
</list-item>
<list-item>
<p>2. Learning:</p>
<list list-type="simple">
<list-item>
<p>&#x201C;Much richer learning experience.&#x201D;</p>
</list-item>
<list-item>
<p>&#x201C;More digestible data can be one screen.&#x201D;</p>
</list-item>
<list-item>
<p>&#x201C;It simplifies my understanding.&#x201D;</p>
</list-item>
<list-item>
<p>&#x201C;Greater understanding, easier to remember.&#x201D;</p>
</list-item>
<list-item>
<p>&#x201C;See data in a new way.&#x201D;</p>
</list-item>
<list-item>
<p>&#x201C;Liked how the time aspect was displayed by real-time graph evolution.&#x201D;</p>
</list-item>
</list>
</list-item>
</list>
<p>When asked how <italic>Flow</italic> helped them, participants&#x2019; reactions were uniformly positive. They found <italic>Flow</italic> to be visually pleasing, entertaining, and easy to use in general:</p>
<disp-quote>
<p>&#x201C;Visually engaging&#x201D;</p>
<p>&#x201C;Great graphics&#x201D;</p>
<p>&#x201C;It&#x2019;s entertaining more than 2d graphs&#x201D;</p>
<p>&#x201C;Visualizations were also clear and easy to read.&#x201D;</p>
</disp-quote>
<p>Lastly, regarding the future of Flow Immersive, some responders asked to see its integration into other platforms, such as multi-user AR; we are exploring how to make all users interactive in a shared multi-user space. They also asked for extrapolations and predictions, and for the ability to direct attention to specific local data points:</p>
<disp-quote>
<p>&#x201C;Integrations into enterprise data visualization platforms with tools to create and publish stories with the data.&#x201D;</p>
<p>&#x201C;Would be cool for two people in different locations to virtually occupy the same AR space.&#x201D;</p>
<p>&#x201C;If you speak about certain data points have those visually &#x2018;flash&#x2019; to quickly help viewer find them.&#x201D;</p>
</disp-quote>
</sec>
<sec id="sec8">
<title>Future Work</title>
<p>It has been anecdotally reported that some people hold the misconception that by simply putting data in a 3D space, major insights will instantly be grasped by the user. In fact, the affordances of 3D can lead to cognitive overload and confusion when the content is not well-scaffolded (<xref ref-type="bibr" rid="ref19">Johnson-Glenberg, 2018</xref>). Nonetheless, thoughtful design that takes into account the special affordances of 3D space should result in increased comprehension of spatially complex content. We posit that it is possible to dynamically show both the forest and the trees, i.e., the big picture as well as the minute details.</p>
<p>The future for Flow Immersive is less about self-produced content to draw interest and more about empowering anyone to tell their own data story with the editor &#x2013; data democratization. With the recent wave of more affordable AR and VR devices, viewing 3D content is becoming more prevalent. A user who enters a 3D environment simply to look at 2D content could be missing rich and vital information. VR headset adoption is slowly picking up; while the majority of users view <italic>Flows</italic> on flat screens or in AR mode on mobile devices, with time, users will be able to create and then experience data in a more immersive manner.</p>
</sec>
<sec id="sec9" sec-type="conclusions">
<title>Conclusion</title>
<p>This article introduces Flow Immersive &#x2013; an interactive multi-user, multi-dimensional, multi-platform data storytelling application. During the Covid-19 pandemic, it has been used by millions to better understand disease transmission. Although not everyone had the opportunity to experience the simulations in emerging AR or VR, the tool still managed to capture wide audience attention <italic>via</italic> other formats. The tool shows promise in terms of engagement and understanding, especially when people are &#x201C;immersed in data.&#x201D; This is because as the format becomes more immersive, the engagement (time in the experience) also increases. The exponential increase in use afforded by the <italic>TikTok</italic> platform was unforeseen, yet exciting. The novel technology of <italic>Flow</italic> is being increasingly used by stakeholders and the wider public to understand, and to make decisions about Covid-19. These sorts of data visualizations hold promise for all types of data and may arguably result in increased data visualization literacy over time.</p>
</sec>
<sec id="sec10">
<title>Data Availability Statement</title>
<p>The original contributions presented in the study are included in the article/supplementary material, further inquiries can be directed to the corresponding authors.</p>
</sec>
<sec id="sec11">
<title>Ethics Statement</title>
<p>Written informed consent was obtained from the individual(s) for the publication of any potentially identifiable images or data included in this article.</p>
</sec>
<sec id="sec12">
<title>Author Contributions</title>
<p>MD: tool design and implementation and overall writing. MK: literature writing and reviewing. MJ-G: reviewing and supervising. All authors contributed to the article and approved the submitted version.</p>
<sec sec-type="COI-statement" id="conf1">
<title>Conflict of Interest</title>
<p>MD is the cofounder of the company Flow Immersive. The remaining authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.</p>
</sec>
</sec>
</body>
<back>
<ack>
<p>We would like to thank Jason Marsh and the rest of the Flow Immersive team for their contributions, support, and work on the enabling technology for this project.</p>
</ack>
<ref-list>
<title>References</title>
<ref id="ref1"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Allcoat</surname> <given-names>D.</given-names></name> <name><surname>von M&#x00FC;hlenen</surname> <given-names>A.</given-names></name></person-group> (<year>2018</year>). <article-title>Learning in virtual reality: effects on performance, emotion and engagement</article-title>. <source>Res. Learn. Technol.</source> <volume>26</volume>:<fpage>2140</fpage>. doi: <pub-id pub-id-type="doi">10.25304/rlt.v26.2140</pub-id></citation></ref>
<ref id="ref2"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Arneson</surname> <given-names>D.</given-names></name> <name><surname>Elliott</surname> <given-names>M.</given-names></name> <name><surname>Mosenia</surname> <given-names>A.</given-names></name> <name><surname>Oskotsky</surname> <given-names>B.</given-names></name> <name><surname>Solodar</surname> <given-names>S.</given-names></name> <name><surname>Vashisht</surname> <given-names>R.</given-names></name> <etal/></person-group>. (<year>2020</year>). <article-title>CovidCounties is an interactive real time tracker of the COVID19 pandemic at the level of US counties</article-title>. <source>Sci. Data</source> <volume>7</volume>:<fpage>405</fpage>. doi: <pub-id pub-id-type="doi">10.1038/s41597-020-00731-8</pub-id>, PMID: <pub-id pub-id-type="pmid">33199721</pub-id></citation></ref>
<ref id="ref3"><citation citation-type="other"><person-group person-group-type="author"><name><surname>Bayyari</surname> <given-names>A.</given-names></name> <name><surname>Tudoreanu</surname> <given-names>M. E.</given-names></name></person-group> (<year>2006</year>). &#x201C;The impact of immersive virtual reality displays on the understanding of data visualization.&#x201D; in <italic>Proceedings of the ACM symposium on Virtual reality software and technology</italic>; November 1-3, 2006; 368&#x2013;371.</citation></ref>
<ref id="ref4"><citation citation-type="other"><person-group person-group-type="author"><name><surname>Blazauskas</surname> <given-names>T.</given-names></name> <name><surname>Maskeliunas</surname> <given-names>R.</given-names></name> <name><surname>Bartkute</surname> <given-names>R.</given-names></name> <name><surname>Kersiene</surname> <given-names>V.</given-names></name> <name><surname>Jurkeviciute</surname> <given-names>I.</given-names></name> <name><surname>Dubosas</surname> <given-names>M.</given-names></name></person-group> (<year>2017</year>). &#x201C;Virtual reality in education: new ways to learn.&#x201D; in <italic>International Conference on Information and Software Technologies</italic>; March 12-15, 2017; Springer, Cham, 457&#x2013;465.</citation></ref>
<ref id="ref5"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>B&#x00F6;rner</surname> <given-names>K.</given-names></name> <name><surname>Bueckle</surname> <given-names>A.</given-names></name> <name><surname>Ginda</surname> <given-names>M.</given-names></name></person-group> (<year>2019</year>). <article-title>Data visualization literacy: definitions, conceptual frameworks, exercises, and assessments</article-title>. <source>Proc. Natl. Acad. Sci. U. S. A.</source> <volume>116</volume>, <fpage>1857</fpage>&#x2013;<lpage>1864</lpage>. doi: <pub-id pub-id-type="doi">10.1073/pnas.1807180116</pub-id>, PMID: <pub-id pub-id-type="pmid">30718386</pub-id></citation></ref>
<ref id="ref6"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Bruno</surname> <given-names>F.</given-names></name> <name><surname>Caruso</surname> <given-names>F.</given-names></name> <name><surname>De Napoli</surname> <given-names>L.</given-names></name> <name><surname>Muzzupappa</surname> <given-names>M.</given-names></name></person-group> (<year>2006</year>). <article-title>Visualization of industrial engineering data in augmented reality</article-title>. <source>J. Visual.</source> <volume>9</volume>, <fpage>319</fpage>&#x2013;<lpage>329</lpage>. doi: <pub-id pub-id-type="doi">10.1007/BF03181679</pub-id></citation></ref>
<ref id="ref7"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Bryson</surname> <given-names>S.</given-names></name></person-group> (<year>1996</year>). <article-title>Virtual reality in scientific visualization</article-title>. <source>Commun. ACM</source> <volume>39</volume>, <fpage>62</fpage>&#x2013;<lpage>71</lpage>. doi: <pub-id pub-id-type="doi">10.1145/229459.229467</pub-id></citation></ref>
<ref id="ref8"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Cical&#x00F2;</surname> <given-names>E.</given-names></name> <name><surname>Valentino</surname> <given-names>M.</given-names></name></person-group> (<year>2019</year>). <article-title>Mapping and visualisation of health data. The contribution of the graphic sciences to medical research from New York yellow fever to China coronavirus</article-title>. <source>Disegnarecon</source> <volume>12</volume>, <fpage>12</fpage>&#x2013;<lpage>21</lpage>. doi: <pub-id pub-id-type="doi">10.20365/disegnarecon.23.2019.12</pub-id></citation></ref>
<ref id="ref9"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Cleveland</surname> <given-names>W. S.</given-names></name></person-group> (<year>1987</year>). <article-title>Research in statistical graphics</article-title>. <source>J. Am. Stat. Assoc.</source> <volume>82</volume>, <fpage>419</fpage>&#x2013;<lpage>423</lpage>. doi: <pub-id pub-id-type="doi">10.1080/01621459.1987.10478444</pub-id></citation></ref>
<ref id="ref10"><citation citation-type="other"><person-group person-group-type="author"><name><surname>DiBenigno</surname> <given-names>M.</given-names></name></person-group> (<year>2020a</year>). Not an expert, but sharing some insights on #Coronavirus using Flow Immersive, Inc.&#x2019;s remote meeting tech. No post-production here. [Post]. LinkedIn. Available at: <ext-link xlink:href="https://www.linkedin.com/posts/michaeldibenigno_coronavirus-augmentedreality-remotework-activity-6641370618115706880-K8p1/" ext-link-type="uri">https://www.linkedin.com/posts/michaeldibenigno_coronavirus-augmentedreality-remotework-activity-6641370618115706880-K8p1/</ext-link> (Accessed April 28, 2021).</citation></ref>
<ref id="ref11"><citation citation-type="other"><person-group person-group-type="author"><name><surname>DiBenigno</surname> <given-names>M.</given-names></name></person-group> (<year>2020b</year>). [@the.data.guy]. Thank you @epidemiologistkat for continuing to communicate research driven insights&#x2026; I just help visualize them and share a story. #datastory #AR [Video]. TikTok. Available at: <ext-link xlink:href="https://www.tiktok.com/@the.data.guy/video/6891058199525444869?lang=en" ext-link-type="uri">https://www.tiktok.com/@the.data.guy/video/6891058199525444869?lang=en</ext-link> (Accessed March 11, 2020).</citation></ref>
<ref id="ref13"><citation citation-type="other"><person-group person-group-type="author"><name><surname>Donalek</surname> <given-names>C.</given-names></name> <name><surname>Djorgovski</surname> <given-names>S. G.</given-names></name> <name><surname>Cioc</surname> <given-names>A.</given-names></name> <name><surname>Wang</surname> <given-names>A.</given-names></name> <name><surname>Zhang</surname> <given-names>J.</given-names></name> <name><surname>Lawler</surname> <given-names>E.</given-names></name> <etal/></person-group>. (<year>2014</year>). &#x201C;Immersive and collaborative data visualization using virtual reality platforms.&#x201D; in <italic>2014 IEEE International Conference on Big Data (Big Data)</italic>. IEEE; October 27-30, 2014; 609&#x2013;614.</citation></ref>
<ref id="ref14"><citation citation-type="other"><person-group person-group-type="author"><collab id="coll1">Flow Immersive</collab></person-group> (<year>2020</year>). UNDP Teaser [Video]. YouTube. Available at: <ext-link xlink:href="https://www.youtube.com/watch?v=i9iBde7frrM" ext-link-type="uri">https://www.youtube.com/watch?v=i9iBde7frrM</ext-link> (Accessed June 2, 2020).</citation></ref>
<ref id="ref16"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Greenwald</surname> <given-names>S. W.</given-names></name> <name><surname>Corning</surname> <given-names>W.</given-names></name> <name><surname>Funk</surname> <given-names>M.</given-names></name> <name><surname>Maes</surname> <given-names>P.</given-names></name></person-group> (<year>2018</year>). <article-title>Comparing learning in virtual reality with learning on a 2D screen using electrostatics activities</article-title>. <source>J. Univ. Comput. Sci.</source> <volume>24</volume>, <fpage>220</fpage>&#x2013;<lpage>245</lpage>. doi: <pub-id pub-id-type="doi">10.3217/jucs-024-02-0220</pub-id></citation></ref>
<ref id="ref17"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Hevner</surname> <given-names>A. R.</given-names></name> <name><surname>March</surname> <given-names>S. T.</given-names></name> <name><surname>Park</surname> <given-names>J.</given-names></name> <name><surname>Ram</surname> <given-names>S.</given-names></name></person-group> (<year>2004</year>). <article-title>Design science in information systems research</article-title>. <source>MIS Q.</source> <volume>28</volume>, <fpage>75</fpage>&#x2013;<lpage>105</lpage>. doi: <pub-id pub-id-type="doi">10.2307/25148625</pub-id></citation></ref>
<ref id="ref18"><citation citation-type="other"><person-group person-group-type="author"><collab id="coll3">Infinitemoment22</collab></person-group> (<year>2020</year>). Coronavirus Tracker in AR from the Browser&#x2014;Flow.GL GIF. Available at: <ext-link xlink:href="https://www.gfycat.com/acrobaticscrawnyangora-coronavirus" ext-link-type="uri">https://www.gfycat.com/acrobaticscrawnyangora-coronavirus</ext-link> (Accessed February 6, 2020).</citation></ref>
<ref id="ref19"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Johnson-Glenberg</surname> <given-names>M. C.</given-names></name></person-group> (<year>2018</year>). <article-title>Immersive VR and education: embodied design principles that include gesture and hand controls</article-title>. <source>Front. Robot. AI</source> <volume>5</volume>:<fpage>81</fpage>. doi: <pub-id pub-id-type="doi">10.3389/frobt.2018.00081</pub-id>, PMID: <pub-id pub-id-type="pmid">33500960</pub-id></citation></ref>
<ref id="ref20"><citation citation-type="other"><person-group person-group-type="author"><name><surname>Johnson-Glenberg</surname> <given-names>M. C.</given-names></name> <name><surname>Su</surname> <given-names>M.</given-names></name> <name><surname>Bartolomea</surname> <given-names>H.</given-names></name> <name><surname>Ly</surname> <given-names>V.</given-names></name> <name><surname>Nieland Zavala</surname> <given-names>R.</given-names></name> <name><surname>Kalina</surname> <given-names>E.</given-names></name></person-group> (<year>2020</year>). Embodied agentic STEM education: effects of 3D VR compared to 2D PC. Paper presented at the Immersive Learning Research Network (iLRN) 2020 Virtual Conference.</citation></ref>
<ref id="ref21"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Kaul</surname> <given-names>S.</given-names></name> <name><surname>Coleman</surname> <given-names>C.</given-names></name> <name><surname>Gotz</surname> <given-names>D.</given-names></name></person-group> (<year>2020</year>). <article-title>A rapidly deployed, interactive, online visualization system to support fatality management during the coronavirus disease 2019 (Covid-19) pandemic</article-title>. <source>J. Am. Med. Inform. Assoc.</source> <volume>27</volume>, <fpage>1943</fpage>&#x2013;<lpage>1948</lpage>. doi: <pub-id pub-id-type="doi">10.1093/jamia/ocaa146</pub-id>, PMID: <pub-id pub-id-type="pmid">33040152</pub-id></citation></ref>
<ref id="ref22"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Krokos</surname> <given-names>E.</given-names></name> <name><surname>Plaisant</surname> <given-names>C.</given-names></name> <name><surname>Varshney</surname> <given-names>A.</given-names></name></person-group> (<year>2019</year>). <article-title>Virtual memory palaces: immersion aids recall</article-title>. <source>Virtual Reality</source> <volume>23</volume>, <fpage>1</fpage>&#x2013;<lpage>15</lpage>. doi: <pub-id pub-id-type="doi">10.1007/s10055-018-0346-3</pub-id></citation></ref>
<ref id="ref23"><citation citation-type="other"><person-group person-group-type="author"><name><surname>Lang</surname> <given-names>D.</given-names></name></person-group> (<year>2020</year>). Screen capture from my Magic Leap 1 of the Flow.GL data visualization of the Coronavirus outbreak. Humans have exceptionally bad [Thumbnail with link attached] [Post]. LinkedIn. Available at: <ext-link xlink:href="https://www.linkedin.com/posts/dennis-lang-a7478787_magicleap-spatialcomputing-datamodeling-activity-6629805853799182337-Pc_K" ext-link-type="uri">https://www.linkedin.com/posts/dennis-lang-a7478787_magicleap-spatialcomputing-datamodeling-activity-6629805853799182337-Pc_K</ext-link> (Accessed April 28, 2021).</citation></ref>
<ref id="ref24"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Lee</surname> <given-names>E. A. L.</given-names></name> <name><surname>Wong</surname> <given-names>K. W.</given-names></name> <name><surname>Fung</surname> <given-names>C. C.</given-names></name></person-group> (<year>2010</year>). <article-title>How does desktop virtual reality enhance learning outcomes? A structural equation modeling approach</article-title>. <source>Comput. Educ.</source> <volume>55</volume>, <fpage>1424</fpage>&#x2013;<lpage>1442</lpage>. doi: <pub-id pub-id-type="doi">10.1016/j.compedu.2010.06.006</pub-id></citation></ref>
<ref id="ref25"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Makransky</surname> <given-names>G.</given-names></name> <name><surname>Borre-Gude</surname> <given-names>S.</given-names></name> <name><surname>Mayer</surname> <given-names>R. E.</given-names></name></person-group> (<year>2019</year>). <article-title>Motivational and cognitive benefits of training in immersive virtual reality based on multiple assessments</article-title>. <source>J. Comput. Assist. Learn.</source> <volume>35</volume>, <fpage>691</fpage>&#x2013;<lpage>707</lpage>. doi: <pub-id pub-id-type="doi">10.1111/jcal.12375</pub-id></citation></ref>
<ref id="ref26"><citation citation-type="other"><person-group person-group-type="author"><name><surname>Meiguins</surname> <given-names>B. S.</given-names></name> <name><surname>do Carmo</surname> <given-names>R. M. C.</given-names></name> <name><surname>Almeida</surname> <given-names>L.</given-names></name> <name><surname>Gon&#x00E7;alves</surname> <given-names>A. S.</given-names></name> <name><surname>Pinheiro</surname> <given-names>S. C. V.</given-names></name> <name><surname>de Brito Garcia</surname> <given-names>M.</given-names></name> <etal/></person-group>. (<year>2006</year>). &#x201C;Multidimensional information visualization using augmented reality.&#x201D; in <italic>Proceedings of the 2006 ACM international conference on Virtual Reality continuum and its applications</italic>; June 14&#x2013;17, 2006; 391&#x2013;394.</citation></ref>
<ref id="ref27"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Mondal</surname> <given-names>M. R. H.</given-names></name> <name><surname>Bharati</surname> <given-names>S.</given-names></name> <name><surname>Podder</surname> <given-names>P.</given-names></name> <name><surname>Podder</surname> <given-names>P.</given-names></name></person-group> (<year>2020</year>). <article-title>Data analytics for novel coronavirus disease</article-title>. <source>Inform. Med. Unlocked</source> <volume>20</volume>:<fpage>100374</fpage>. doi: <pub-id pub-id-type="doi">10.1016/j.imu.2020.100374</pub-id>, PMID: <pub-id pub-id-type="pmid">32835073</pub-id></citation></ref>
<ref id="ref28"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Olshannikova</surname> <given-names>E.</given-names></name> <name><surname>Ometov</surname> <given-names>A.</given-names></name> <name><surname>Koucheryavy</surname> <given-names>Y.</given-names></name> <name><surname>Olsson</surname> <given-names>T.</given-names></name></person-group> (<year>2015</year>). <article-title>Visualizing Big Data with augmented and virtual reality: challenges and research agenda</article-title>. <source>J. Big Data</source> <volume>2</volume>:<fpage>22</fpage>. doi: <pub-id pub-id-type="doi">10.1186/s40537-015-0031-2</pub-id></citation></ref>
<ref id="ref29"><citation citation-type="book"><person-group person-group-type="author"><name><surname>Pennefather</surname> <given-names>P.</given-names></name> <name><surname>Krebs</surname> <given-names>C.</given-names></name></person-group> (<year>2019</year>). &#x201C;<article-title>Exploring the role of xR in visualisations for use in medical education</article-title>&#x201D; in <source>Biomedical Visualisation.</source> ed. <person-group person-group-type="editor"><name><surname>Rea</surname> <given-names>P. M.</given-names></name></person-group> (<publisher-loc>Cham</publisher-loc>: <publisher-name>Springer</publisher-name>), <fpage>15</fpage>&#x2013;<lpage>23</lpage>.</citation></ref>
<ref id="ref30"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Romano</surname> <given-names>A.</given-names></name> <name><surname>Sotis</surname> <given-names>C.</given-names></name> <name><surname>Dominioni</surname> <given-names>G.</given-names></name> <name><surname>Guidi</surname> <given-names>S.</given-names></name></person-group> (<year>2020</year>). <article-title>The scale of COVID-19 graphs affects understanding, attitudes, and policy preferences</article-title>. <source>Health Econ.</source> <volume>29</volume>, <fpage>1482</fpage>&#x2013;<lpage>1494</lpage>. doi: <pub-id pub-id-type="doi">10.1002/hec.4143</pub-id>, PMID: <pub-id pub-id-type="pmid">32844495</pub-id></citation></ref>
<ref id="ref31"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Schall</surname> <given-names>G.</given-names></name> <name><surname>Mendez</surname> <given-names>E.</given-names></name> <name><surname>Kruijff</surname> <given-names>E.</given-names></name> <name><surname>Veas</surname> <given-names>E.</given-names></name> <name><surname>Junghanns</surname> <given-names>S.</given-names></name> <name><surname>Reitinger</surname> <given-names>B.</given-names></name> <etal/></person-group>. (<year>2009</year>). <article-title>Handheld augmented reality for underground infrastructure visualization</article-title>. <source>Pers. Ubiquit. Comput.</source> <volume>13</volume>, <fpage>281</fpage>&#x2013;<lpage>291</lpage>. doi: <pub-id pub-id-type="doi">10.1007/s00779-008-0204-5</pub-id></citation></ref>
<ref id="ref32"><citation citation-type="book"><person-group person-group-type="author"><name><surname>Simon</surname> <given-names>H. A.</given-names></name></person-group> (<year>1996</year>). <source>The Sciences of the Artificial</source>. <edition>3rd</edition> Edn. <publisher-loc>Cambridge, MA</publisher-loc>: <publisher-name>MIT Press</publisher-name>.</citation></ref>
<ref id="ref33"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Slater</surname> <given-names>M.</given-names></name></person-group> (<year>2003</year>). <article-title>A note on presence terminology</article-title>. <source>Presence Connect</source> <volume>3</volume>, <fpage>1</fpage>&#x2013;<lpage>5</lpage>.</citation></ref>
<ref id="ref34"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Teb&#x00E9;</surname> <given-names>C.</given-names></name> <name><surname>Valls</surname> <given-names>J.</given-names></name> <name><surname>Satorra</surname> <given-names>P.</given-names></name> <name><surname>Tob&#x00ED;as</surname> <given-names>A.</given-names></name></person-group> (<year>2020</year>). <article-title>COVID19-world: a shiny application to perform comprehensive country-specific data visualization for SARS-CoV-2 epidemic</article-title>. <source>BMC Med. Res. Methodol.</source> <volume>20</volume>:<fpage>235</fpage>. doi: <pub-id pub-id-type="doi">10.1186/s12874-020-01121-9</pub-id>, PMID: <pub-id pub-id-type="pmid">32958001</pub-id></citation></ref>
<ref id="ref35"><citation citation-type="other"><person-group person-group-type="author"><collab id="coll4">Texas Medical Center</collab></person-group> (<year>2020</year>). 2020 State of the Texas Medical Center [Video]. YouTube. Available at: <ext-link xlink:href="https://youtu.be/PrJM9aXeCgE?t=1831" ext-link-type="uri">https://youtu.be/PrJM9aXeCgE?t=1831</ext-link> (Accessed December 11, 2020).</citation></ref>
<ref id="ref36"><citation citation-type="other"><person-group person-group-type="author"><name><surname>Thomas</surname> <given-names>B. H.</given-names></name> <name><surname>Marner</surname> <given-names>M.</given-names></name> <name><surname>Smith</surname> <given-names>R. T.</given-names></name> <name><surname>Elsayed</surname> <given-names>N. A. M.</given-names></name> <name><surname>Von Itzstein</surname> <given-names>S.</given-names></name> <name><surname>Klein</surname> <given-names>K.</given-names></name> <etal/></person-group>. (<year>2014</year>). &#x201C;Spatial augmented reality&#x2014;a tool for 3D data visualization.&#x201D; in <italic>2014 IEEE VIS International Workshop on 3DVis (3DVis)</italic>. IEEE; November 9, 2014; 45&#x2013;50.</citation></ref>
<ref id="ref37"><citation citation-type="other"><person-group person-group-type="author"><name><surname>Wu</surname> <given-names>R.</given-names></name></person-group> (<year>2020</year>). &#x201C;TikTok data guy&#x201D; Educates through dazzling visualizations. Available at: <ext-link xlink:href="https://spectrumlocalnews.com/nc/charlotte/news/2020/12/22/tik-tok-guy" ext-link-type="uri">https://spectrumlocalnews.com/nc/charlotte/news/2020/12/22/tik-tok-guy</ext-link> (Accessed December 23, 2020).</citation></ref>
<ref id="ref38"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Wu</surname> <given-names>B.</given-names></name> <name><surname>Yu</surname> <given-names>X.</given-names></name> <name><surname>Gu</surname> <given-names>X.</given-names></name></person-group> (<year>2020</year>). <article-title>Effectiveness of immersive virtual reality using head-mounted displays on learning performance: a meta-analysis</article-title>. <source>Br. J. Educ. Technol.</source> <volume>51</volume>, <fpage>1991</fpage>&#x2013;<lpage>2005</lpage>. doi: <pub-id pub-id-type="doi">10.1111/bjet.13023</pub-id></citation></ref>
<ref id="ref39"><citation citation-type="other"><person-group person-group-type="author"><name><surname>Zuo</surname> <given-names>F.</given-names></name> <name><surname>Wang</surname> <given-names>J.</given-names></name> <name><surname>Gao</surname> <given-names>J.</given-names></name> <name><surname>Ozbay</surname> <given-names>K.</given-names></name> <name><surname>Ban</surname> <given-names>X. J.</given-names></name> <name><surname>Shen</surname> <given-names>Y.</given-names></name> <etal/></person-group>. (<year>2020</year>). An interactive data visualization and analytics tool to evaluate mobility and sociability trends during Covid-19. arXiv [Preprint]. doi: <pub-id pub-id-type="doi">10.13140/RG.2.2.32750.02881</pub-id></citation></ref></ref-list><fn-group><fn id="fn0001"><p><sup>1</sup><italic>Flow</italic> also has focused on helping enterprises better communicate with data and has been used by organizations such as the UNDP, The World Bank, Deloitte, and BlackRock.</p></fn><fn id="fn0002"><p><sup>2</sup><ext-link xlink:href="https://a.Flow.gl/#/Flow/5zsc0ky" ext-link-type="uri">https://a.Flow.gl/#/Flow/5zsc0ky</ext-link></p></fn><fn id="fn0003"><p><sup>3</sup>Example situations we are aware of include institutions and news channels [e.g., internally at Institute for Health Metrics and Evaluation (IHME) and at Spectrum News 1 in North Carolina; <xref ref-type="bibr" rid="ref37">Wu, 2020</xref>].</p></fn></fn-group></back></article>