<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE article PUBLIC "-//NLM//DTD Journal Publishing DTD v2.3 20070202//EN" "journalpublishing.dtd">
<article article-type="brief-report" dtd-version="2.3" xml:lang="EN" xmlns:mml="http://www.w3.org/1998/Math/MathML" xmlns:xlink="http://www.w3.org/1999/xlink">
<front>
<journal-meta>
<journal-id journal-id-type="publisher-id">Front. Sens.</journal-id>
<journal-title>Frontiers in Sensors</journal-title>
<abbrev-journal-title abbrev-type="pubmed">Front. Sens.</abbrev-journal-title>
<issn pub-type="epub">2673-5067</issn>
<publisher>
<publisher-name>Frontiers Media S.A.</publisher-name>
</publisher>
</journal-meta>
<article-meta>
<article-id pub-id-type="publisher-id">752754</article-id>
<article-id pub-id-type="doi">10.3389/fsens.2021.752754</article-id>
<article-categories>
<subj-group subj-group-type="heading">
<subject>Sensors</subject>
<subj-group>
<subject>Perspective</subject>
</subj-group>
</subj-group>
</article-categories>
<title-group>
<article-title>Sensing and Biosensing in the World of Autonomous Machines and Intelligent Systems</article-title>
<alt-title alt-title-type="left-running-head">Oliveira and Oliveira</alt-title>
<alt-title alt-title-type="right-running-head">Sensors in Intelligent Systems</alt-title>
</title-group>
<contrib-group>
<contrib contrib-type="author" corresp="yes">
<name>
<surname>Oliveira</surname>
<given-names>Osvaldo N.</given-names>
<suffix>Jr</suffix>
</name>
<xref ref-type="aff" rid="aff1">
<sup>1</sup>
</xref>
<xref ref-type="corresp" rid="c001">&#x2a;</xref>
<uri xlink:href="https://loop.frontiersin.org/people/298462/overview"/>
</contrib>
<contrib contrib-type="author">
<name>
<surname>Oliveira</surname>
<given-names>Maria Cristina F.</given-names>
</name>
<xref ref-type="aff" rid="aff2">
<sup>2</sup>
</xref>
</contrib>
</contrib-group>
<aff id="aff1">
<label>
<sup>1</sup>
</label>S&#xe3;o Carlos Institute of Physics, University of S&#xe3;o Paulo (USP), <addr-line>S&#xe3;o Carlos</addr-line>, <country>Brazil</country>
</aff>
<aff id="aff2">
<label>
<sup>2</sup>
</label>Institute of Mathematical Sciences and Computing, University of S&#xe3;o Paulo, <addr-line>S&#xe3;o Carlos</addr-line>, <country>Brazil</country>
</aff>
<author-notes>
<fn fn-type="edited-by">
<p>
<bold>Edited by:</bold> <ext-link ext-link-type="uri" xlink:href="https://loop.frontiersin.org/people/89244/overview">Dermot Diamond</ext-link>, Dublin City University, Ireland</p>
</fn>
<fn fn-type="edited-by">
<p>
<bold>Reviewed by:</bold> <ext-link ext-link-type="uri" xlink:href="https://loop.frontiersin.org/people/242739/overview">Stanislav Moshkalev</ext-link>, State University of Campinas, Brazil</p>
<p>
<ext-link ext-link-type="uri" xlink:href="https://loop.frontiersin.org/people/935050/overview">Juyoung Leem</ext-link>, Stanford University, United&#x20;States</p>
</fn>
<corresp id="c001">&#x2a;Correspondence: Osvaldo N. Oliveira Jr, <email>chu@ifsc.usp.br</email>
</corresp>
<fn fn-type="other">
<p>This article was submitted to Sensor Devices, a section of the journal Frontiers in Sensors</p>
</fn>
</author-notes>
<pub-date pub-type="epub">
<day>17</day>
<month>09</month>
<year>2021</year>
</pub-date>
<pub-date pub-type="collection">
<year>2021</year>
</pub-date>
<volume>2</volume>
<elocation-id>752754</elocation-id>
<history>
<date date-type="received">
<day>03</day>
<month>08</month>
<year>2021</year>
</date>
<date date-type="accepted">
<day>06</day>
<month>09</month>
<year>2021</year>
</date>
</history>
<permissions>
<copyright-statement>Copyright &#xa9; 2021 Oliveira and Oliveira.</copyright-statement>
<copyright-year>2021</copyright-year>
<copyright-holder>Oliveira and Oliveira</copyright-holder>
<license xlink:href="http://creativecommons.org/licenses/by/4.0/">
<p>This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these&#x20;terms.</p>
</license>
</permissions>
<abstract>
<p>In this paper we discuss how nanotech-based sensors and biosensors are providing the data for autonomous machines and intelligent systems, using two metaphors to exemplify the convergence between nanotechnology and artificial intelligence (AI). The first concerns sensors that mimic the five human senses; the second, the integration of data of varied sources and natures into an intelligent system that manages autonomous services, as in a train station.</p>
</abstract>
<kwd-group>
<kwd>sensors</kwd>
<kwd>biosensors</kwd>
<kwd>artificial intelligence</kwd>
<kwd>machine learning</kwd>
<kwd>electronic tongue</kwd>
<kwd>electronic skin</kwd>
<kwd>visual analytics</kwd>
</kwd-group>
<contract-sponsor id="cn001">Funda&#xe7;&#xe3;o de Amparo &#xe0; Pesquisa Do Estado de S&#xe3;o Paulo<named-content content-type="fundref-id">10.13039/501100001807</named-content>
</contract-sponsor>
<contract-sponsor id="cn002">Conselho Nacional de Desenvolvimento Cient&#xed;fico e Tecnol&#xf3;gico<named-content content-type="fundref-id">10.13039/501100003593</named-content>
</contract-sponsor>
</article-meta>
</front>
<body>
<sec id="s1">
<title>Introduction</title>
<p>The rapid progress in autonomous systems with artificial intelligence (AI) has raised the expectation that machines and software systems will soon be able to perform intellectual tasks as efficiently as humans (perhaps even better), to the extent that in the near future, for the first time in history, such systems may generate knowledge with no human intervention (<xref ref-type="bibr" rid="B31">Rodrigues et&#x20;al., 2021</xref>). This tremendous achievement will only be realized if these manmade systems can acquire, process, and make sense of large amounts of combined data from the environment, in addition to mastering natural languages. The latter requirement appears particularly challenging, since machine learning (ML) and other currently successful AI approaches are not yet sufficient to interpret text (<xref ref-type="bibr" rid="B31">Rodrigues et&#x20;al., 2021</xref>). Another stringent requirement is the capability of continuously acquiring information with sensors and biosensors, in many cases emulating human capabilities. As for shorter-term applications, medical diagnosis, and indeed any other type of diagnosis, is among the topics that may benefit most from AI. This is due to a convergence with multiple technologies crucial for diagnosis, namely the nanotech-based methodologies that allow ubiquitous sensing and biosensing to be integrated into diagnosis and surveillance systems (<xref ref-type="bibr" rid="B32">Rodrigues et&#x20;al., 2016</xref>). Diagnosis is essentially a classification task, for which ML has proven especially well suited, in spite of its limitations in performing tasks that require interpretation (<xref ref-type="bibr" rid="B4">Buscaglia et&#x20;al., 2021</xref>).</p>
<p>Living beings depend on sensing for their survival, growth, reproduction, and interaction. Humans, in particular, use their five senses (touch, sight, hearing, smell and taste) at all times to monitor their environment and interact with it, and most of this sensorial detection relies on pattern recognition. Throughout history, a range of devices and instruments has been developed to augment and assist human monitoring capability, but the field of sensing (and biosensing) became well established only in the final decades of the 20th century. By way of illustration, a search in the Web of Science in May 2021 for published papers containing these terms returned only a few dozen papers per year for the period from 1900 to 1950, most of them unrelated to sensor devices and instead associated with sensorial phenomena. In the 1970s, publication rates reached the order of 1,000 papers per year; in the 1990s, the annual numbers increased to between 10,000 and 20,000; and in the last 2&#xa0;years, the number of papers published ranged between 140,000 and 150,000 per year. This outstanding increase in scientific production was obviously a consequence of progress in research on novel materials, including manmade materials as well as natural materials adapted for sensing purposes. Though Nature provided inspiration from the early stages, particularly with sensing in animals, two factors led the field to expand as if it were entirely independent of human (or animal) sensing. First, real-time monitoring was unfeasible in many sensing applications, and integration of different types of sensors&#x2014;e.g., to emulate the five senses&#x2014;remained mostly elusive. Second, the underlying detection principles differ: unlike the natural sensors in living beings, the analysis of data from manmade sensors normally does not rely on pattern recognition methods.</p>
<p>The significant progress in analytical techniques and in sensors and biosensors as nanotech products introduces opportunities for bridging these gaps. Cheap sensors that permit ubiquitous sensing and real-time monitoring in specific applications are now routinely fabricated. Some of these sensors can be wearable or even implantable, and may be more sensitive than the corresponding human senses. An archetypal example is the electronic tongue (e-tongue), whose sensitivity for some tastes can be as high as 10,000&#x20;times the average sensitivity of humans (<xref ref-type="bibr" rid="B30">Riul Jr. et&#x20;al., 2010</xref>). Sensing and biosensing employ distinct detection principles, which permits integrating different types of data (e.g., obtained with electronic tongues and electronic noses). Furthermore, computational methods are now available to process the large amounts of data generated by ubiquitous sensing and real-time monitoring, with the bonus of a substantially expanded capacity in terms of memory and processing power compared to living beings. These methods also allow exploiting pattern recognition strategies, thus bringing sensing tasks somewhat closer to how sensing is performed by humans. Last, but not least, available sensing principles are much broader than those prevailing in living beings, which may open novel ways to fabricate intelligent systems and robots with unprecedented capabilities.</p>
<p>In this paper we focus on two aspects: 1) sensing systems that mimic human senses, exemplifying how manmade sensors are being developed with bioinspiration; 2) sensors and biosensors aimed at integration into intelligent systems, for which we discuss the stringent requirements that must be fulfilled for data analysis based on computational methods, especially machine learning.</p>
</sec>
<sec id="s1-1">
<title>Mimicking the Five Senses</title>
<p>Today&#x2019;s technology allows for mimicking the five human senses (<xref ref-type="bibr" rid="B8">Guerrini et&#x20;al., 2017</xref>) with the multiple methodologies discussed below, each briefly illustrated with one or two examples, as it is not our purpose to review the major contributions in any of these areas. We also distinguish between sensing systems conceived as artificial counterparts of human organs and those simply performing similar functions, even if their shape and nature have nothing to do with the sensing organs.</p>
<sec id="s1-2">
<title>Touch</title>
<p>Pressure and strain sensors have been developed toward creating electronic skins (<xref ref-type="bibr" rid="B15">Lipomi et&#x20;al., 2011</xref>; <xref ref-type="bibr" rid="B9">Hammock et&#x20;al., 2013</xref>), ionic skins (<xref ref-type="bibr" rid="B26">Qiu et&#x20;al., 2021</xref>) or epidermal electronics (<xref ref-type="bibr" rid="B13">Kim et&#x20;al., 2011</xref>). The terms used may vary, and so do the functions performed by e-skins or ionic skins, which can go well beyond those of human (or other animal) skin. Nevertheless, e-skins share the following features: they should be flexible, stretchable, and self-healing, and able to sense temperature as well as wide ranges of pressure (not only touch) and strain. With recent developments in nanomaterials and self-healable polymers, it has been possible to obtain e-skins with augmented performance compared to their biological counterparts, especially superior spatial resolution and thermal sensitivity (<xref ref-type="bibr" rid="B9">Hammock et&#x20;al., 2013</xref>). Future improvements are focused on adding functionalities for specific purposes. For health applications, for instance, biosensors may be incorporated to monitor body conditions and detect diseases, while antimicrobial coatings may be employed to functionalize the e-skins and assist in wound healing (<xref ref-type="bibr" rid="B39">Yang et&#x20;al., 2019</xref>). Within the paradigm of epidermal electronics, on the other hand, the systems envisaged may include not only sensors but also transistors, capacitors, light-emitting diodes, photovoltaic devices and wireless coils (<xref ref-type="bibr" rid="B13">Kim et&#x20;al., 2011</xref>). Self-powered tactile sensors produced with piezoelectric polymer nanofibers are indicative of these capabilities (<xref ref-type="bibr" rid="B16">Liu et&#x20;al., 2021a</xref>). An example of the sensing ability of self-healing hydrogels is the detection of distinct body movements, including those from speaking, which can be relevant for speech processing in the future (<xref ref-type="bibr" rid="B17">Liu et&#x20;al., 2021b</xref>). We shall return to this point when discussing the hearing&#x20;sense.</p>
</sec>
<sec id="s1-3">
<title>Taste</title>
<p>Taste sensing has been mimicked for decades with electronic tongues (e-tongues), which normally contain an array of sensing units whose detection principles are based mostly on electrochemical methods (<xref ref-type="bibr" rid="B38">Winquist, 2008</xref>) and impedance spectroscopy (<xref ref-type="bibr" rid="B29">Riul et&#x20;al., 2002</xref>). The rationale behind an e-tongue is that humans perceive taste as a combination of five basic tastes, viz. sweet, salty, sour, bitter and umami, meaning that the brain receives, from the sensors in the papillae, signals that are not specific to any given chemical compound. This is the so-called global selectivity principle (<xref ref-type="bibr" rid="B29">Riul et&#x20;al., 2002</xref>), according to which the sensing units, in contrast to biosensors, do not need to contain materials with specific interactions with the samples. Obviously, obtaining sensing units with high sensitivity requires the materials to be judiciously chosen, bearing in mind the intended application. For example, if an e-tongue is to be used to distinguish liquids with varied acidity levels, polyanilines can be selected for the sensing units, since their electrical properties are very sensitive to pH. In practice, an e-tongue typically comprises four to six sensing units made of nanomaterials or nanostructured polymer films, where the distinct sensing units are expected to yield different responses for a given liquid. This variability is important to establish a &#x201c;fingerprint&#x201d; for the liquids under analysis, which may have similar properties. An e-tongue may take different shapes. While the majority comprises sensing units with nanostructured films deposited over areas on the order of cm<sup>2</sup>, microfluidic e-tongues have also been produced (<xref ref-type="bibr" rid="B35">Shimizu et&#x20;al., 2017</xref>). This is an advantageous arrangement because it requires only small amounts of sample for the measurements and allows for multiplex sensing in miniaturized setups.</p>
<p>Though conceived to mimic the tasting function, e-tongues may also be employed in several tasks unrelated to taste. Hence, in addition to evaluating taste in wines, juices, and coffee (<xref ref-type="bibr" rid="B30">Riul Jr. et&#x20;al., 2010</xref>) and taste masking in pharmaceutical drugs (<xref ref-type="bibr" rid="B18">Machado et&#x20;al., 2018</xref>), e-tongues have been utilized to detect poisoning and pollution in water, to identify fuel adulteration, and to analyze soils (<xref ref-type="bibr" rid="B3">Braunger et&#x20;al., 2017</xref>). Three other aspects of e-tongues are worth mentioning. The first is the incorporation of biosensors as one (or more) of the sensing units in the arrays. The overall selectivity can be enhanced in these so-called bioelectronic tongues, as demonstrated for the discrimination of two similar tropical diseases (<xref ref-type="bibr" rid="B25">Perinoto et&#x20;al., 2010</xref>). A second aspect refers to data analysis, since the global selectivity concept requires assessing the combined responses of various sensing units. As a consequence, a considerable amount of data is generated, which must be analyzed with statistical and computational methods. Reducing the dimensionality of the data representation is thus a central operation. The methods often applied to e-tongue data include principal component analysis (PCA) (<xref ref-type="bibr" rid="B11">Jolliffe and Cadima, 2016</xref>) and interactive document mapping (IDMAP) (<xref ref-type="bibr" rid="B19">Minghim et&#x20;al., 2006</xref>). With these methods, the response measured for one sample&#x2014;e.g., one impedance spectrum&#x2014;is mapped to a graphical marker, and the markers are laid out so that those depicting samples with similar responses are placed close to each other. Hence, one may identify visual clusters of similar elements, and a single, unambiguous cluster for each class suggests that correct classification of the samples is possible. If the number of samples is too large, visualizing clusters on a map becomes inefficient owing to overlapping markers and clusters. The data may still be processed with machine learning algorithms (<xref ref-type="bibr" rid="B22">Neto et&#x20;al., 2021</xref>), which can be either supervised or unsupervised. In supervised learning, it may also be possible to correlate the e-tongue response with human taste (<xref ref-type="bibr" rid="B6">Ferreira et&#x20;al., 2007</xref>). The third aspect is the combination of e-tongues with electronic noses (e-noses, described below). Especially for drinks and beverages such as coffee and wine, flavor perception depends on taste and smell combined, and it is therefore advisable to employ e-tongues in conjunction with e-noses (<xref ref-type="bibr" rid="B33">Rodriguez-Mendez et&#x20;al., 2014</xref>).</p>
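<p>The dimensionality reduction step described above can be sketched in a few lines of code. The example below is a minimal illustration on a hypothetical, synthetic e-tongue dataset (six sensing units, two classes of liquids); the dataset, class labels, and values are assumptions for illustration only, not measurements from any real device.</p>

```python
import numpy as np

# Hypothetical e-tongue dataset: 6 sensing units, each yielding one feature
# (e.g., capacitance at a chosen frequency) per liquid sample.
# Rows = samples from two classes of liquids; values are illustrative only.
rng = np.random.default_rng(0)
class_a = rng.normal(loc=1.0, scale=0.05, size=(10, 6))   # e.g., pure juice
class_b = rng.normal(loc=1.4, scale=0.05, size=(10, 6))   # e.g., adulterated
X = np.vstack([class_a, class_b])

# PCA via SVD: center the data, then project onto the top-2 principal axes.
Xc = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = Xc @ Vt[:2].T          # 2-D map: one graphical marker per sample

# Samples with similar responses land close together, forming visual clusters.
centroid_a = scores[:10].mean(axis=0)
centroid_b = scores[10:].mean(axis=0)
separation = np.linalg.norm(centroid_a - centroid_b)
print(round(float(separation), 2))
```

<p>In a real analysis, each row would hold features extracted from a measured impedance spectrum, and the 2-D scores would be plotted so clusters can be inspected visually.</p>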
</sec>
<sec id="s1-4">
<title>Smell</title>
<p>Electronic noses (e-noses) are the counterparts of e-tongues for smell, also based on the global selectivity concept, with arrays of vapour-sensing devices employed to mimic the mammalian olfactory system (<xref ref-type="bibr" rid="B27">Rakow and Suslick, 2000</xref>). As with e-tongues, varied detection principles can be exploited, including electrical, electrochemical, and optical measurements, or any type of measurement used in gas sensors. The materials for building the sensing units are selected to allow interaction with various types of vapours, as with metalloporphyrin dyes whose optical properties are affected significantly by ligating vapours such as alcohols, amines, ethers, phosphines and thiols (<xref ref-type="bibr" rid="B27">Rakow and Suslick, 2000</xref>). E-noses are mostly made with nanostructured films, e.g., Langmuir-Blodgett (LB) films (<xref ref-type="bibr" rid="B2">Barker et&#x20;al., 1994</xref>), and, as in biosensors, may include sensing units capable of specific interactions with analytes. An example of the latter was an e-nose with field-effect transistors made with carbon nanotubes functionalized with lipid nanodiscs containing insect odorant receptors (<xref ref-type="bibr" rid="B20">Murugathas et&#x20;al., 2019</xref>). Their selective electrical response to the corresponding ligands of the odorant receptors allowed the smells of fresh and rotten fish to be distinguished (<xref ref-type="bibr" rid="B20">Murugathas et&#x20;al., 2019</xref>).</p>
</sec>
<sec id="s1-5">
<title>Sight</title>
<p>Spectacular developments have been witnessed in computational vision owing to the widespread deployment of all sorts of cameras. Though one may argue that the availability of high-quality cameras is far from sufficient for mimicking the sight sense, recent breakthroughs in computational vision have demonstrated that artificial systems can already equal, or even surpass, human ability in certain image or video analysis tasks (<xref ref-type="bibr" rid="B12">Ng et&#x20;al., 2018</xref>). Facial recognition, for example, can be performed with superior accuracy by intelligent systems employing deep learning strategies (<xref ref-type="bibr" rid="B21">NandhiniAbirami et&#x20;al., 2021</xref>). Also, cameras may be used to detect and observe phenomena beyond human capabilities, as in the case of infrared vision (<xref ref-type="bibr" rid="B10">Havens and Sharp, 2016</xref>). The challenge of replicating the functionality of the human eye in a single device is nevertheless still formidable, especially if prosthetic eyes are desired, since a fully functional analogue of the eye remains a long-term goal (<xref ref-type="bibr" rid="B28">Regal et&#x20;al., 2021</xref>). Research on novel materials for bionic eyes and special cameras focuses essentially on bioinspired and biointegrated electronics to fabricate deformable and self-healable devices that preserve functionality while being deformed (<xref ref-type="bibr" rid="B14">Lee et&#x20;al., 2021</xref>). This is analogous to other types of devices conceived to mimic human senses, e.g., the electronic skin and the electronic ear (see below). Another feature shared by these mimicking systems is the need to process large amounts of data, in many cases of entirely different natures. A recent example is the evaluation of withering of tea leaves by combining data from near-infrared spectroscopy, an electronic eye, and a colorimetric sensing array, which were treated with machine learning algorithms in a machine vision system (<xref ref-type="bibr" rid="B37">Wang et&#x20;al., 2021</xref>).</p>
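<p>The data-fusion strategy just mentioned, in which features from different instruments are combined before classification, can be sketched as follows. This is a minimal illustration under assumed dimensions and values: the feature generators and the nearest-centroid rule are hypothetical stand-ins for real spectroscopic and colorimetric features and for the machine learning models used in practice.</p>

```python
import numpy as np

# Hypothetical fusion of two data sources, loosely inspired by the
# tea-withering example: near-infrared (NIR) spectral features and
# colorimetric-array features are concatenated into one vector per
# sample, then classified with a simple nearest-centroid rule.
rng = np.random.default_rng(1)

def sample(withered):
    """Generate one fused feature vector for a leaf sample (synthetic)."""
    nir = rng.normal(1.0 + 0.3 * withered, 0.05, size=8)    # NIR features
    color = rng.normal(0.5 - 0.2 * withered, 0.05, size=4)  # colorimetric
    return np.concatenate([nir, color])                     # fused vector

# "Training set": 20 fresh (label 0) and 20 withered (label 1) samples.
train = {label: np.stack([sample(label) for _ in range(20)]) for label in (0, 1)}
centroids = {label: data.mean(axis=0) for label, data in train.items()}

def predict(x):
    """Assign the label whose class centroid is nearest in fused space."""
    return min(centroids, key=lambda c: np.linalg.norm(x - centroids[c]))

test_x = sample(1)              # an unseen withered-leaf sample
print(predict(test_x))
```

<p>The point of the sketch is the concatenation step: once heterogeneous measurements share one feature vector, any classifier can operate on them jointly.</p>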
</sec>
<sec id="s1-6">
<title>Hearing</title>
<p>The hearing ability may be mimicked with so-called electronic ears (<xref ref-type="bibr" rid="B36">Solanki et&#x20;al., 2017</xref>) and other devices, which basically work as strain or pressure sensors. The simplest ones are chemiresistive sensors containing nanomaterials, whose response varies with the mechanical stimulus with sufficient sensitivity to detect whistling, breathing and speaking (<xref ref-type="bibr" rid="B36">Solanki et&#x20;al., 2017</xref>). In terms of sensing materials, as already mentioned, they are similar to those employed for touch sensors and electronic skins. One issue not yet exploited to a reasonable extent is speech processing. Current applications involving speech processing&#x2014;prevalent in intelligent assistants, for example&#x2014;use high-quality microphones to acquire sound. New developments in wearable strain and pressure sensors allow us to envisage sound acquisition directly from the human (or other living being&#x2019;s) body. This would revolutionize the working principles of speech processing, especially in the biomedical area, as it would enable real-time online monitoring. Obviously, such an intrusive data collection approach raises ethical issues, for any type of user utterance would be captured and recorded. Another thread comprises innovative applications made possible by the wide availability of low-cost autonomous microphones, such as the study of environmental soundscapes for purposes ranging from monitoring biodiversity in natural environments (bioacoustic monitoring) (<xref ref-type="bibr" rid="B7">Gibb et&#x20;al., 2019</xref>) to monitoring the ocean fauna (<xref ref-type="bibr" rid="B34">S&#xe1;nchez-Gendriz and Padovese, 2017</xref>).</p>
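<p>A minimal sketch of how audio from low-cost autonomous microphones can be screened for acoustic events, as in bioacoustic monitoring, is given below. The synthetic signal, frame size, and energy threshold are illustrative assumptions; real pipelines use richer acoustic indices and learned classifiers rather than a single threshold.</p>

```python
import numpy as np

# Hypothetical event detection in a bioacoustic stream: frame the signal,
# compute per-frame RMS energy, and flag frames above an energy threshold.
fs = 8000                                    # sampling rate (Hz)
t = np.arange(fs) / fs                       # 1 s of audio
signal = 0.01 * np.random.default_rng(2).standard_normal(fs)  # background
signal[2000:3000] += 0.5 * np.sin(2 * np.pi * 1200 * t[2000:3000])  # a "call"

frame = 400                                  # 50 ms frames
frames = signal[: fs // frame * frame].reshape(-1, frame)
rms = np.sqrt((frames ** 2).mean(axis=1))    # energy per frame
events = np.flatnonzero(rms > 0.1)           # frames containing the call
print(events)
```

<p>Frames flagged this way would then be stored or forwarded for species-level classification, keeping transmission and storage costs low for continuously operating sensors.</p>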
</sec>
</sec>
<sec id="s1-7">
<title>Integration of Sensing and Biosensing Into Intelligent Systems</title>
<p>Some of the examples associated with the five human senses are already representative of integrated systems based on artificial intelligence. Well beyond these examples, however, many other types of sensors and biosensors exist that allow for monitoring substances, phenomena, and processes. In spite of the advances in integrating sensors and biosensors mentioned here, we should stress that real-time monitoring and seamless integration of multiple sensing devices using different technologies are still at an embryonic stage, as discussed next with a hypothetical scenario of an autonomous transportation station in a metropolitan&#x20;area.</p>
<sec id="s1-8">
<title>Autonomous Public Spaces and Infrastructures</title>
<p>Intelligent systems supported by sensing and biosensing are likely to be employed in any type of application involving control and actuation. Such integration will be particularly challenging in large infrastructures, for instance public spaces combining multiple initiatives. As an illustration, let us consider a station serving a busy town area, integrating train and bus services plus a parking lot, as illustrated in <xref ref-type="fig" rid="F1">Figure&#x20;1</xref>. Let us imagine it is connected to a large green park close to a river, with plenty of vegetation and pedestrian and cycling lanes. City administrators want this area to be a safe zone for users of the station and park. The station should be environmentally sustainable, and the administrators also plan to partner with other city managers to run a preventive health care program targeted at the population of users, and with researchers from the local university to study and preserve the fauna in the park. One may think of the sensing devices deployed in such a scenario: video cameras for real-time monitoring, both to adjust train timetables according to passenger flow and to watch critical areas and spots for security; water- and air-quality sensors in the station, the park, and their surroundings. The site may include a photovoltaic plant, whose energy generation, along with the energy consumption in the station and park areas, is continuously tracked and adjusted as necessary. Light-sensitive sensors can switch illumination on and off; sound sensors at multiple spots in the park monitor animal diversity and how the operation of the busy station affects animal behavior. Station users are encouraged to stop and collect fundamental health indicators such as blood pressure, glucose, and cholesterol levels, and are directed to medical assistance when issues are identified; the program may keep track of, and approach, those users for whom critical issues have been identified.</p>
<fig id="F1" position="float">
<label>FIGURE 1</label>
<caption>
<p>A busy urban transportation station and the multiple sensors supporting its automated operation, integrated with multiple initiatives of public interest that continuously generate data of diverse&#x20;types.</p>
</caption>
<graphic xlink:href="fsens-02-752754-g001.tif"/>
</fig>
<p>A hypothetical integrated AI system to manage the station autonomously is depicted in the flow chart in <xref ref-type="fig" rid="F2">Figure&#x20;2</xref>, which presents an oversimplified, abstract view of an intelligent data processing approach. The top layer represents the data sources, including the multitude of sensing devices to monitor internal and external risks in the facilities. These sensors will continuously generate data of a variety of types, e.g., images, audio, measurements, text forms and documents. The system must handle multiple data types, as the data is produced by sensing devices that will include those related to the human senses, i.e.,&#x20;part of the sensing will involve sight, touch, taste, smell, and hearing. The diverse data types will demand treatment (storage, filtering, processing) and curation, as indicated in the middle block of the figure. Furthermore, processes will be required to verify whether the data make sense and whether the datasets have sufficient quality as input for the AI system, represented as a single block for simplicity&#x2014;though more likely it would consist of multiple integrated systems. Finally, the system will need to learn data representations for algorithmic processing. This AI system will be responsible for the analysis (what happened, where, and at which danger or threat level) and then for a corresponding action&#x2014;in some cases in real time. Analysis tasks essentially consist of looking for specific patterns in the data that indicate an anomalous situation, or a particular category of event, which in turn must trigger the corresponding actions from the different systems represented at the bottom of the figure. Such a complex scenario consists essentially of a combination of devices for monitoring (quantities and processes) and for responding (reacting) to the measurements. The nature of the application may change entirely, e.g., to diagnosis in a medical care facility or the integrated operating room of a smart city, but the core components of such systems are essentially the same. The implication is that sensing must be ubiquitous and is bound to generate huge quantities of data, which can be connected to the Internet, thus enabling tasks and services to be executed and controlled remotely (<xref ref-type="bibr" rid="B1">Alzahrani, 2017</xref>).</p>
<fig id="F2" position="float">
<label>FIGURE 2</label>
<caption>
<p>Schematic representation of the automated system to manage the train station, comprising various layers of devices and control systems. The top layer represents the varied data sources, whose data will be acquired and curated in dedicated repositories. These will feed the AI system responsible for processing the data and providing the input for the various applications (surveillance, energy management, etc.). The AI module and these applications will comprise a number of independent analysis and control systems.</p>
</caption>
<graphic xlink:href="fsens-02-752754-g002.tif"/>
</fig>
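<p>The layered flow of Figure&#x20;2, from heterogeneous sensor readings through curation and classification to triggered actions, can be sketched in code as follows. All sensor types, thresholds, and subsystem names here are hypothetical, and the rule-based classifier is a deliberately simple stand-in for the ML components discussed in the text.</p>

```python
# Sketch of the layered flow: curate (quality check) -> classify -> dispatch.
# Sensor types, thresholds, and subsystem names are illustrative assumptions.

def curate(reading):
    """Quality check: drop malformed or out-of-range readings."""
    ok = isinstance(reading.get("value"), (int, float)) and reading["value"] >= 0
    return reading if ok else None

def classify(reading):
    """Toy anomaly detection: flag readings above a per-sensor threshold."""
    thresholds = {"air_quality": 100.0, "noise": 85.0, "crowd_density": 4.0}
    limit = thresholds.get(reading["type"], float("inf"))
    return "anomaly" if reading["value"] > limit else "normal"

def dispatch(reading, label):
    """Route anomalies to the subsystem responsible for the response."""
    actions = {"air_quality": "ventilation", "noise": "park_monitoring",
               "crowd_density": "timetable_control"}
    return actions[reading["type"]] if label == "anomaly" else None

stream = [
    {"type": "air_quality", "value": 140.0},   # poor air in the station
    {"type": "noise", "value": 60.0},          # ordinary park soundscape
    {"type": "crowd_density", "value": -1.0},  # corrupted measurement
]
triggered = []
for r in stream:
    r = curate(r)                              # corrupted readings are dropped
    if r is not None:
        action = dispatch(r, classify(r))
        if action:
            triggered.append(action)
print(triggered)
```

<p>Even in this toy form, the sketch makes the article&#x2019;s point concrete: the curation layer must run before any classification, and each detected anomaly must map unambiguously to a responsible subsystem.</p>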
</sec>
<sec id="s1-9">
<title>Existing AI Technology and Major Challenges</title>
<p>The proposal outlined in <xref ref-type="fig" rid="F2">Figure&#x20;2</xref>, of a sensor-enabled, integrated AI system with multiple controls over its environment, may seem a far-fetched view of AI applications today. Yet, a careful analysis of its components indicates that existing technologies are already sufficient for implementing most of the tasks. Indeed, the literature is rich in examples of applications representative of all the components in the figure. Automation has been achieved in a multitude of tasks for which the input is digital data, such as object detection in images, natural language translation, or traffic control, with performance levels equivalent or superior to those of human operators. Smart integration of the required components is, however, a mammoth endeavor. For example, the management and curation of data of such disparate natures and formats (i.e.,&#x20;images, text, audio, videos, sensing data) is a tremendous challenge. The architecture of the AI system will be highly complex, as it must be prepared to detect and handle multiple ordinary and anomalous situations in a timely manner. Even more relevant, such a system, despite its complexity, is limited in that only classification tasks will be performed efficiently, as already mentioned. Fully autonomous operation nonetheless demands other relevant tasks, such as assessing risks and making decisions based on such assessments. These latter tasks require at least some degree of interpretation, and therefore current technologies are still not sufficient for the autonomous operation envisaged.</p>
<p>The various issues involved in dealing with big data and machine learning for applications such as the autonomous station have been discussed in reviews and opinion papers (<xref ref-type="bibr" rid="B23">Oliveira et&#x20;al., 2014</xref>; <xref ref-type="bibr" rid="B32">Rodrigues et&#x20;al., 2016</xref>; <xref ref-type="bibr" rid="B24">Paulovich et&#x20;al., 2018</xref>; <xref ref-type="bibr" rid="B31">Rodrigues et&#x20;al., 2021</xref>). We observe a synergistic movement driven by the combination of data generation at unprecedented levels of detail, variety and velocity with massive computing capability. This combination is crucial for introducing some kind of &#x201c;intelligence&#x201d; into systems that process data to execute complex tasks. For instance, data now play an &#x201c;active&#x201d; role in scientific discovery: rather than solely supporting hypothesis verification, data collected at massive scales with ubiquitous, networked sensing devices connected to &#x201c;things&#x201d; (the &#x201c;Internet of Things&#x201d;) can support &#x201c;intelligent&#x201d; automation of complex tasks and foster an active search for hidden hypotheses.</p>
<p>It must be stressed that building complex autonomous systems such as the hypothetical transportation station will demand considerable human effort. Human experts need to be prepared to inspect a system&#x2019;s underlying algorithms and to supervise task execution and decision making during development and operation, in order to ensure the system complies with the intended goals. This is different from merely inspecting the results yielded by a standalone algorithm on a relatively small data set. ML algorithms can be trained to identify patterns from data, but data quality is a fundamental issue (otherwise, &#x201c;garbage in, garbage out&#x201d;). Moreover, ensuring the quality and correctness of the outcomes may demand considerable user supervision. The complexity posed by the sheer scale of data generation and processing, plus the need to handle distinct data types produced by multiple sources in an integrated manner, requires substantial changes in the role of the experts, and it also modifies the type of expertise required. Help can be found in the field known as &#x201c;visual analytics&#x201d; (<xref ref-type="bibr" rid="B5">Endert et&#x20;al., 2017</xref>), whose goal is to study ways of introducing effective user interaction into data processing and model learning. An additional concern addressed with visual representations is to enhance model interpretability. Domain experts in charge of analysis, as well as decision makers, must be well informed about the concepts behind the techniques, understand their limitations, and learn how to parametrize algorithms properly and how to interpret the results and performance measures. Techniques for data and model visualization can contribute to empowering the mutual roles of a human expert and an ML algorithm in conducting analysis&#x20;tasks.</p>
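A minimal sketch of the visual-analytics idea discussed above is to project high-dimensional sensing data onto two principal components, yielding coordinates a human expert can inspect in a scatter plot. The code below performs principal component analysis via singular value decomposition; the two synthetic clusters are assumptions for illustration, not real sensor measurements.

```python
# Sketch: PCA projection of high-dimensional (synthetic) sensing data to 2-D,
# the kind of view a visual-analytics tool would offer a domain expert.
import numpy as np

def pca_project(X, n_components=2):
    """Center X (samples x features) and project onto the top principal axes."""
    Xc = X - X.mean(axis=0)
    # Right singular vectors of the centered data are the principal axes
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:n_components].T

rng = np.random.default_rng(0)
# Two synthetic clusters in a 10-dimensional feature space
cluster_a = rng.normal(0.0, 0.1, size=(20, 10))
cluster_b = rng.normal(1.0, 0.1, size=(20, 10))
X = np.vstack([cluster_a, cluster_b])

Y = pca_project(X)   # shape (40, 2), ready for a scatter plot
print(Y.shape)       # (40, 2)
```

In such a plot the two groups separate along the first component, which is exactly the kind of pattern a human supervisor can verify at a glance before trusting an automated model trained on the same data.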
</sec>
</sec>
<sec id="s2">
<title>Concluding Remarks</title>
<p>The integration of sensing and biosensing with machine learning algorithms to develop sophisticated autonomous systems has been explored here using two metaphors. In the first, we highlighted recent developments in nanotechnology to fabricate devices that can mimic the five human senses. This choice was motivated not only by the fact that such nanotech devices constitute an essential requirement for autonomous entities, but also by the data processing they involve. The second metaphor was an autonomous transportation station, with which we illustrated how an integrated &#x201c;intelligent&#x201d; system could manage the station using input data from a multitude of sensors. This latter example was intended as a mere illustration, for there is a virtually endless list of complex problems that could be similarly tackled at a large scale with systems directly fed with data from sensors, with limited human intervention. These range from traffic control, population health, and environmental monitoring to precision agriculture and manufacturing processes, amongst others. The wider dissemination of solutions based on machine learning from data provided by sensors is an inescapable trend. Their successful application depends less on technological issues and more on tackling important conceptual and practical ethical, legal and social issues regarding data collection and usage. Yet, some technological issues remain, such as the seamless integration of data from different systems and ensuring continuously adaptive learning in highly dynamic environments.</p>
</sec>
</body>
<back>
<sec id="s3">
<title>Data Availability Statement</title>
<p>The original contributions presented in the study are included in the article/Supplementary Material; further inquiries can be directed to the corresponding author.</p>
</sec>
<sec id="s4">
<title>Author Contributions</title>
<p>All authors listed have made a substantial, direct, and intellectual contribution to the work and approved it for publication.</p>
</sec>
<sec id="s5">
<title>Funding</title>
<p>INEO, CAPES, CNPq (301847/2017&#x2013;7) and FAPESP (2018/22214&#x2013;6) (Brazil).</p>
</sec>
<sec sec-type="COI-statement" id="s6">
<title>Conflict of Interest</title>
<p>The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.</p>
<p>The handling editor declared a past co-authorship/collaboration on a previous publication with the authors.</p>
</sec>
<sec sec-type="disclaimer" id="s7">
<title>Publisher&#x2019;s Note</title>
<p>All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors, and the reviewers. Any product that may be evaluated in this article, or claim that may be made by its manufacturer, is not guaranteed or endorsed by the publisher.</p>
</sec>
<ref-list>
<title>References</title>
<ref id="B1">
<citation citation-type="book">
<person-group person-group-type="author">
<name>
<surname>Alzahrani</surname>
<given-names>S. M.</given-names>
</name>
</person-group> (<year>2017</year>).<article-title>Sensing for the Internet of Things and its Applications</article-title>, In <conf-name>5th International Conference on Future Internet of Things and Cloud Workshops</conf-name>. <publisher-loc>Prague</publisher-loc>: <publisher-name>FiCloudW</publisher-name>, <fpage>88</fpage>&#x2013;<lpage>92</lpage>. <pub-id pub-id-type="doi">10.1109/FiCloudW.2017.94</pub-id> </citation>
</ref>
<ref id="B2">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Barker</surname>
<given-names>P. S.</given-names>
</name>
<name>
<surname>Chen</surname>
<given-names>J.&#x20;R.</given-names>
</name>
<name>
<surname>Agbor</surname>
<given-names>N. E.</given-names>
</name>
<name>
<surname>Monkman</surname>
<given-names>A. P.</given-names>
</name>
<name>
<surname>Mars</surname>
<given-names>P.</given-names>
</name>
<name>
<surname>Petty</surname>
<given-names>M. C.</given-names>
</name>
</person-group> (<year>1994</year>). <article-title>Vapour Recognition Using Organic Films and Artificial Neural Networks</article-title>. <source>Sensors Actuators B: Chem.</source> <volume>17</volume>, <fpage>143</fpage>&#x2013;<lpage>147</lpage>. <pub-id pub-id-type="doi">10.1016/0925-4005(94)87042-X</pub-id> </citation>
</ref>
<ref id="B3">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Braunger</surname>
<given-names>M.</given-names>
</name>
<name>
<surname>Shimizu</surname>
<given-names>F.</given-names>
</name>
<name>
<surname>Jimenez</surname>
<given-names>M.</given-names>
</name>
<name>
<surname>Amaral</surname>
<given-names>L.</given-names>
</name>
<name>
<surname>Piazzetta</surname>
<given-names>M.</given-names>
</name>
<name>
<surname>Gobbi</surname>
<given-names>&#xc2;.</given-names>
</name>
<etal/>
</person-group> (<year>2017</year>). <article-title>Microfluidic Electronic Tongue Applied to Soil Analysis</article-title>. <source>Chemosensors</source> <volume>5</volume>, <fpage>14</fpage>. <pub-id pub-id-type="doi">10.3390/chemosensors5020014</pub-id> </citation>
</ref>
<ref id="B4">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Buscaglia</surname>
<given-names>L. A.</given-names>
</name>
<name>
<surname>Oliveira</surname>
<given-names>O. N.</given-names>
<suffix>Jr</suffix>
</name>
<name>
<surname>Carmo</surname>
<given-names>J.&#x20;P.</given-names>
</name>
</person-group> (<year>2021</year>). <article-title>Roadmap for Electrical Impedance Spectroscopy for Sensing: A Tutorial</article-title>. <source>IEEE Sensors J.</source>, <fpage>1</fpage>. <pub-id pub-id-type="doi">10.1109/JSEN.2021.3085237</pub-id> </citation>
</ref>
<ref id="B5">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Endert</surname>
<given-names>A.</given-names>
</name>
<name>
<surname>Ribarsky</surname>
<given-names>W.</given-names>
</name>
<name>
<surname>Turkay</surname>
<given-names>C.</given-names>
</name>
<name>
<surname>Wong</surname>
<given-names>B. L. W.</given-names>
</name>
<name>
<surname>Nabney</surname>
<given-names>I.</given-names>
</name>
<name>
<surname>Blanco</surname>
<given-names>I. D.</given-names>
</name>
<etal/>
</person-group> (<year>2017</year>). <article-title>The State of the Art in Integrating Machine Learning into Visual Analytics</article-title>. <source>Comp. Graphics Forum</source> <volume>36</volume> (<issue>8</issue>), <fpage>458</fpage>&#x2013;<lpage>486</lpage>. <pub-id pub-id-type="doi">10.1111/cgf.13092</pub-id> </citation>
</ref>
<ref id="B6">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Ferreira</surname>
<given-names>E. J.</given-names>
</name>
<name>
<surname>Pereira</surname>
<given-names>R. C. T.</given-names>
</name>
<name>
<surname>Delbem</surname>
<given-names>A. C. B.</given-names>
</name>
<name>
<surname>Oliveira</surname>
<given-names>O. N.</given-names>
<suffix>Jr</suffix>
</name>
<name>
<surname>Mattoso</surname>
<given-names>L. H. C.</given-names>
</name>
</person-group> (<year>2007</year>). <article-title>Random Subspace Method for Analysing Coffee with Electronic Tongue</article-title>. <source>Electron. Lett.</source> <volume>43</volume>, <fpage>1138</fpage>&#x2013;<lpage>1139</lpage>. <pub-id pub-id-type="doi">10.1049/el:20071182</pub-id> </citation>
</ref>
<ref id="B7">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Gibb</surname>
<given-names>R.</given-names>
</name>
<name>
<surname>Browning</surname>
<given-names>E.</given-names>
</name>
<name>
<surname>Glover&#x2010;Kapfer</surname>
<given-names>P.</given-names>
</name>
<name>
<surname>Jones</surname>
<given-names>K. E.</given-names>
</name>
</person-group> (<year>2019</year>). <article-title>Emerging Opportunities and Challenges for Passive Acoustics in Ecological Assessment and Monitoring</article-title>. <source>Methods Ecol. Evol.</source> <volume>10</volume>, <fpage>169</fpage>&#x2013;<lpage>185</lpage>. <pub-id pub-id-type="doi">10.1111/2041-210X.13101</pub-id> </citation>
</ref>
<ref id="B8">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Guerrini</surname>
<given-names>L.</given-names>
</name>
<name>
<surname>Garcia-Rico</surname>
<given-names>E.</given-names>
</name>
<name>
<surname>Pazos-Perez</surname>
<given-names>N.</given-names>
</name>
<name>
<surname>Alvarez-Puebla</surname>
<given-names>R. A.</given-names>
</name>
</person-group> (<year>2017</year>). <article-title>Smelling, Seeing, Tasting-Old Senses for New Sensing</article-title>. <source>ACS Nano</source> <volume>11</volume>, <fpage>5217</fpage>&#x2013;<lpage>5222</lpage>. <pub-id pub-id-type="doi">10.1021/acsnano.7b03176</pub-id> </citation>
</ref>
<ref id="B9">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Hammock</surname>
<given-names>M. L.</given-names>
</name>
<name>
<surname>Chortos</surname>
<given-names>A.</given-names>
</name>
<name>
<surname>Tee</surname>
<given-names>B. C.-K.</given-names>
</name>
<name>
<surname>Tok</surname>
<given-names>J.&#x20;B.-H.</given-names>
</name>
<name>
<surname>Bao</surname>
<given-names>Z.</given-names>
</name>
</person-group> (<year>2013</year>). <article-title>25th Anniversary Article: The Evolution of Electronic Skin (E-Skin): A Brief History, Design Considerations, and Recent Progress</article-title>. <source>Adv. Mater.</source> <volume>25</volume>, <fpage>5997</fpage>&#x2013;<lpage>6038</lpage>. <pub-id pub-id-type="doi">10.1002/adma.201302240</pub-id> </citation>
</ref>
<ref id="B10">
<citation citation-type="book">
<person-group person-group-type="author">
<name>
<surname>Havens</surname>
<given-names>K. J.</given-names>
</name>
<name>
<surname>Sharp</surname>
<given-names>E. J.</given-names>
</name>
</person-group> (<year>2016</year>). <source>Thermal Imaging Techniques to Survey and Monitor Animals in the Wild: A Methodology</source>. <publisher-loc>Amsterdam</publisher-loc>: <publisher-name>Academic Press</publisher-name>. <comment>ISBN 978-0-12-803384-5</comment>. <pub-id pub-id-type="doi">10.1016/C2014-0-03312-6</pub-id> </citation>
</ref>
<ref id="B11">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Jolliffe</surname>
<given-names>I. T.</given-names>
</name>
<name>
<surname>Cadima</surname>
<given-names>J.</given-names>
</name>
</person-group> (<year>2016</year>). <article-title>Principal Component Analysis: a Review and Recent Developments</article-title>. <source>Phil. Trans. R. Soc. A.</source> <volume>374</volume>, <fpage>20150202</fpage>. <pub-id pub-id-type="doi">10.1098/rsta.2015.0202</pub-id> </citation>
</ref>
<ref id="B12">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Ng</surname>
<given-names>J.&#x20;Y.</given-names>
</name>
<name>
<surname>Neumann</surname>
<given-names>J.</given-names>
</name>
<name>
<surname>Davis</surname>
<given-names>L. S.</given-names>
</name>
</person-group> (<year>2018</year>). <article-title>ActionFlowNet: Learning Motion Representation for Action Recognition</article-title>, <conf-name>IEEE Winter Conference on Applications of Computer Vision (WACV)</conf-name>, <fpage>1616</fpage>&#x2013;<lpage>1624</lpage>. <pub-id pub-id-type="doi">10.1109/WACV.2018.00179</pub-id> </citation>
</ref>
<ref id="B13">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Kim</surname>
<given-names>D.-H.</given-names>
</name>
<name>
<surname>Lu</surname>
<given-names>N.</given-names>
</name>
<name>
<surname>Ma</surname>
<given-names>R.</given-names>
</name>
<name>
<surname>Kim</surname>
<given-names>Y.-S.</given-names>
</name>
<name>
<surname>Kim</surname>
<given-names>R.-H.</given-names>
</name>
<name>
<surname>Wang</surname>
<given-names>S.</given-names>
</name>
<etal/>
</person-group> (<year>2011</year>). <article-title>Epidermal Electronics</article-title>. <source>Science</source> <volume>333</volume>, <fpage>838</fpage>&#x2013;<lpage>843</lpage>. <pub-id pub-id-type="doi">10.1126/science.1206157</pub-id> </citation>
</ref>
<ref id="B14">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Lee</surname>
<given-names>W.</given-names>
</name>
<name>
<surname>Yun</surname>
<given-names>H.</given-names>
</name>
<name>
<surname>Song</surname>
<given-names>J.-K.</given-names>
</name>
<name>
<surname>Sunwoo</surname>
<given-names>S.-H.</given-names>
</name>
<name>
<surname>Kim</surname>
<given-names>D.-H.</given-names>
</name>
</person-group> (<year>2021</year>). <article-title>Nanoscale Materials and Deformable Device Designs for Bioinspired and Biointegrated Electronics</article-title>. <source>Acc. Mater. Res.</source> <volume>2</volume>, <fpage>266</fpage>&#x2013;<lpage>281</lpage>. <pub-id pub-id-type="doi">10.1021/accountsmr.1c00020</pub-id> </citation>
</ref>
<ref id="B15">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Lipomi</surname>
<given-names>D. J.</given-names>
</name>
<name>
<surname>Vosgueritchian</surname>
<given-names>M.</given-names>
</name>
<name>
<surname>Tee</surname>
<given-names>B. C.-K.</given-names>
</name>
<name>
<surname>Hellstrom</surname>
<given-names>S. L.</given-names>
</name>
<name>
<surname>Lee</surname>
<given-names>J.&#x20;A.</given-names>
</name>
<name>
<surname>Fox</surname>
<given-names>C. H.</given-names>
</name>
<etal/>
</person-group> (<year>2011</year>). <article-title>Skin-like Pressure and Strain Sensors Based on Transparent Elastic Films of Carbon Nanotubes</article-title>. <source>Nat. Nanotech</source> <volume>6</volume>, <fpage>788</fpage>&#x2013;<lpage>792</lpage>. <pub-id pub-id-type="doi">10.1038/NNANO.2011.184</pub-id> </citation>
</ref>
<ref id="B16">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Liu</surname>
<given-names>Q.</given-names>
</name>
<name>
<surname>Jin</surname>
<given-names>L.</given-names>
</name>
<name>
<surname>Zhang</surname>
<given-names>P.</given-names>
</name>
<name>
<surname>Zhang</surname>
<given-names>B.</given-names>
</name>
<name>
<surname>Li</surname>
<given-names>Y.</given-names>
</name>
<name>
<surname>Xie</surname>
<given-names>S.</given-names>
</name>
<etal/>
</person-group> (<year>2021</year>). <article-title>Nanofibrous Grids Assembled Orthogonally from Direct-Written Piezoelectric Fibers as Self-Powered Tactile Sensors</article-title>. <source>ACS Appl. Mater. Inter.</source> <volume>13</volume>, <fpage>10623</fpage>&#x2013;<lpage>10631</lpage>. <pub-id pub-id-type="doi">10.1021/acsami.0c22318</pub-id> </citation>
</ref>
<ref id="B17">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Liu</surname>
<given-names>X.</given-names>
</name>
<name>
<surname>Ren</surname>
<given-names>Z.</given-names>
</name>
<name>
<surname>Liu</surname>
<given-names>F.</given-names>
</name>
<name>
<surname>Zhao</surname>
<given-names>L.</given-names>
</name>
<name>
<surname>Ling</surname>
<given-names>Q.</given-names>
</name>
<name>
<surname>Gu</surname>
<given-names>H.</given-names>
</name>
</person-group> (<year>2021</year>). <article-title>Multifunctional Self-Healing Dual Network Hydrogels Constructed via Host-Guest Interaction and Dynamic Covalent Bond as Wearable Strain Sensors for Monitoring Human and Organ Motions</article-title>. <source>ACS Appl. Mater. Inter.</source> <volume>13</volume>, <fpage>14612</fpage>&#x2013;<lpage>14622</lpage>. <pub-id pub-id-type="doi">10.1021/acsami.1c03213</pub-id> </citation>
</ref>
<ref id="B18">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Machado</surname>
<given-names>J.&#x20;C.</given-names>
</name>
<name>
<surname>Shimizu</surname>
<given-names>F. M.</given-names>
</name>
<name>
<surname>Ortiz</surname>
<given-names>M.</given-names>
</name>
<name>
<surname>Pinhatti</surname>
<given-names>M. S.</given-names>
</name>
<name>
<surname>Carr</surname>
<given-names>O.</given-names>
</name>
<name>
<surname>Guterres</surname>
<given-names>S. S.</given-names>
</name>
<etal/>
</person-group> (<year>2018</year>). <article-title>Efficient Praziquantel Encapsulation into Polymer Microcapsules and Taste Masking Evaluation Using an Electronic Tongue</article-title>. <source>Bull. Chem. Soc. Jpn.</source> <volume>91</volume>, <fpage>865</fpage>&#x2013;<lpage>874</lpage>. <pub-id pub-id-type="doi">10.1246/bcsj.20180005</pub-id> </citation>
</ref>
<ref id="B19">
<citation citation-type="book">
<person-group person-group-type="author">
<name>
<surname>Minghim</surname>
<given-names>R.</given-names>
</name>
<name>
<surname>Paulovich</surname>
<given-names>F. V.</given-names>
</name>
<name>
<surname>de Andrade Lopes</surname>
<given-names>A.</given-names>
</name>
</person-group> (<year>2006</year>). &#x201c;<article-title>Content-based Text Mapping Using Multi-Dimensional Projections for Exploration of Document Collections</article-title>,&#x201d; in <source>Visualization and Data Analysis 2006, Proceedings of SPIE-IS&#x26;T Electronic Imaging</source>. Editors <person-group person-group-type="editor">
<name>
<surname>Erbacher</surname>
<given-names>Robert. F.</given-names>
</name>
<name>
<surname>Roberts</surname>
<given-names>Jonathan. C.</given-names>
</name>
<name>
<surname>Gr&#xf6;hn</surname>
<given-names>Matti. T.</given-names>
</name>
<name>
<surname>B&#xf6;rner</surname>
<given-names>Katy.</given-names>
</name>
</person-group> (<publisher-loc>SPIE</publisher-loc>), <volume>6060</volume>, <fpage>60600S</fpage>. <pub-id pub-id-type="doi">10.1117/12.650880</pub-id> </citation>
</ref>
<ref id="B20">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Murugathas</surname>
<given-names>T.</given-names>
</name>
<name>
<surname>Zheng</surname>
<given-names>H. Y.</given-names>
</name>
<name>
<surname>Colbert</surname>
<given-names>D.</given-names>
</name>
<name>
<surname>Kralicek</surname>
<given-names>A. V.</given-names>
</name>
<name>
<surname>Carraher</surname>
<given-names>C.</given-names>
</name>
<name>
<surname>Plank</surname>
<given-names>N. O. V.</given-names>
</name>
</person-group> (<year>2019</year>). <article-title>Biosensing with Insect Odorant Receptor Nanodiscs and Carbon Nanotube Field-Effect Transistors</article-title>. <source>ACS Appl. Mater. Inter.</source> <volume>11</volume>, <fpage>9530</fpage>&#x2013;<lpage>9538</lpage>. <pub-id pub-id-type="doi">10.1021/acsami.8b19433</pub-id> </citation>
</ref>
<ref id="B21">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Nandhini Abirami</surname>
<given-names>R.</given-names>
</name>
<name>
<surname>Durai Raj Vincent</surname>
<given-names>P. M.</given-names>
</name>
<name>
<surname>Srinivasan</surname>
<given-names>K.</given-names>
</name>
<name>
<surname>Tariq</surname>
<given-names>U.</given-names>
</name>
<name>
<surname>Chang</surname>
<given-names>C.-Y.</given-names>
</name>
</person-group> (<year>2021</year>). <article-title>Deep CNN and Deep GAN in Computational Visual Perception-Driven Image Analysis</article-title>. <source>Complexity</source> <volume>2021</volume>, <fpage>1</fpage>&#x2013;<lpage>30</lpage>. <pub-id pub-id-type="doi">10.1155/2021/5541134</pub-id> </citation>
</ref>
<ref id="B22">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Neto</surname>
<given-names>M. P.</given-names>
</name>
<name>
<surname>Soares</surname>
<given-names>A. C.</given-names>
</name>
<name>
<surname>Oliveira</surname>
<given-names>O. N.</given-names>
<suffix>Jr</suffix>
</name>
<name>
<surname>Paulovich</surname>
<given-names>F. V.</given-names>
</name>
</person-group> (<year>2021</year>). <article-title>Machine Learning Used to Create a Multidimensional Calibration Space for Sensing and Biosensing Data</article-title>. <source>Bull. Chem. Soc. Jpn.</source> <volume>94</volume>, <fpage>1553</fpage>&#x2013;<lpage>1562</lpage>. <pub-id pub-id-type="doi">10.1246/bcsj.20200359</pub-id> </citation>
</ref>
<ref id="B23">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Oliveira</surname>
<given-names>O. N.</given-names>
<suffix>Jr</suffix>
</name>
<name>
<surname>Neves</surname>
<given-names>T. T. A. T.</given-names>
</name>
<name>
<surname>Paulovich</surname>
<given-names>F. V.</given-names>
</name>
<name>
<surname>de Oliveira</surname>
<given-names>M. C. F.</given-names>
</name>
</person-group> (<year>2014</year>). <article-title>Where Chemical Sensors May Assist in Clinical Diagnosis Exploring &#x201c;Big Data&#x201d;</article-title>. <source>Chem. Lett.</source> <volume>43</volume>, <fpage>1672</fpage>&#x2013;<lpage>1679</lpage>. <pub-id pub-id-type="doi">10.1246/cl.140762</pub-id> </citation>
</ref>
<ref id="B24">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Paulovich</surname>
<given-names>F. V.</given-names>
</name>
<name>
<surname>De Oliveira</surname>
<given-names>M. C. F.</given-names>
</name>
<name>
<surname>Oliveira</surname>
<given-names>O. N.</given-names>
<suffix>Jr</suffix>
</name>
</person-group> (<year>2018</year>). <article-title>A Future with Ubiquitous Sensing and Intelligent Systems</article-title>. <source>ACS Sens.</source> <volume>3</volume>, <fpage>1433</fpage>&#x2013;<lpage>1438</lpage>. <pub-id pub-id-type="doi">10.1021/acssensors.8b00276</pub-id> </citation>
</ref>
<ref id="B25">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Perinoto</surname>
<given-names>&#xc2;. C.</given-names>
</name>
<name>
<surname>Maki</surname>
<given-names>R. M.</given-names>
</name>
<name>
<surname>Colhone</surname>
<given-names>M. C.</given-names>
</name>
<name>
<surname>Santos</surname>
<given-names>F. R.</given-names>
</name>
<name>
<surname>Migliaccio</surname>
<given-names>V.</given-names>
</name>
<name>
<surname>Daghastanli</surname>
<given-names>K. R.</given-names>
</name>
<etal/>
</person-group> (<year>2010</year>). <article-title>Biosensors for Efficient Diagnosis of Leishmaniasis: Innovations in Bioanalytics for a Neglected Disease</article-title>. <source>Anal. Chem.</source> <volume>82</volume>, <fpage>9763</fpage>&#x2013;<lpage>9768</lpage>. <pub-id pub-id-type="doi">10.1021/ac101920t</pub-id> </citation>
</ref>
<ref id="B26">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Qiu</surname>
<given-names>W.</given-names>
</name>
<name>
<surname>Zhang</surname>
<given-names>C.</given-names>
</name>
<name>
<surname>Chen</surname>
<given-names>G.</given-names>
</name>
<name>
<surname>Zhu</surname>
<given-names>H.</given-names>
</name>
<name>
<surname>Zhang</surname>
<given-names>Q.</given-names>
</name>
<name>
<surname>Zhu</surname>
<given-names>S.</given-names>
</name>
<etal/>
</person-group> (<year>2021</year>). <article-title>Colorimetric Ionic Organohydrogels Mimicking Human Skin for Mechanical Stimuli Sensing and Injury Visualization</article-title>. <source>ACS Appl. Mater. Inter.</source> <volume>13</volume>, <fpage>26490</fpage>&#x2013;<lpage>26497</lpage>. <pub-id pub-id-type="doi">10.1021/acsami.1c04911</pub-id> </citation>
</ref>
<ref id="B27">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Rakow</surname>
<given-names>N. A.</given-names>
</name>
<name>
<surname>Suslick</surname>
<given-names>K. S.</given-names>
</name>
</person-group> (<year>2000</year>). <article-title>A Colorimetric Sensor Array for Odour Visualization</article-title>. <source>Nature</source> <volume>406</volume>, <fpage>710</fpage>&#x2013;<lpage>713</lpage>. <pub-id pub-id-type="doi">10.1038/35021028</pub-id> </citation>
</ref>
<ref id="B28">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Regal</surname>
<given-names>S.</given-names>
</name>
<name>
<surname>Troughton</surname>
<given-names>J.</given-names>
</name>
<name>
<surname>Djenizian</surname>
<given-names>T.</given-names>
</name>
<name>
<surname>Ramuz</surname>
<given-names>M.</given-names>
</name>
</person-group> (<year>2021</year>). <article-title>Biomimetic Models of the Human Eye, and Their Applications</article-title>. <source>Nanotechnology</source> <volume>32</volume>, <fpage>302001</fpage>. <pub-id pub-id-type="doi">10.1088/1361-6528/abf3ee</pub-id> </citation>
</ref>
<ref id="B29">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Riul</surname>
<given-names>A.</given-names>
<suffix>Jr</suffix>
</name>
<name>
<surname>Dos Santos</surname>
<given-names>D. S.</given-names>
</name>
<name>
<surname>Wohnrath</surname>
<given-names>K.</given-names>
</name>
<name>
<surname>Di Tommazo</surname>
<given-names>R.</given-names>
</name>
<name>
<surname>Carvalho</surname>
<given-names>A. C. P. L. F.</given-names>
</name>
<name>
<surname>Fonseca</surname>
<given-names>F. J.</given-names>
</name>
<etal/>
</person-group> (<year>2002</year>). <article-title>Artificial Taste Sensor: Efficient Combination of Sensors Made from Langmuir&#x2212;Blodgett Films of Conducting Polymers and a Ruthenium Complex and Self-Assembled Films of an Azobenzene-Containing Polymer</article-title>. <source>Langmuir</source> <volume>18</volume>, <fpage>239</fpage>&#x2013;<lpage>245</lpage>. <pub-id pub-id-type="doi">10.1021/la011017d</pub-id> </citation>
</ref>
<ref id="B30">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Riul</surname>
<given-names>A.</given-names>
<suffix>Jr</suffix>
</name>
<name>
<surname>Dantas</surname>
<given-names>C. A. R.</given-names>
</name>
<name>
<surname>Miyazaki</surname>
<given-names>C. M.</given-names>
</name>
<name>
<surname>Oliveira</surname>
<given-names>O. N.</given-names>
<suffix>Jr</suffix>
</name>
</person-group> (<year>2010</year>). <article-title>Recent Advances in Electronic Tongues</article-title>. <source>Analyst</source> <volume>135</volume>, <fpage>2481</fpage>&#x2013;<lpage>2495</lpage>. <pub-id pub-id-type="doi">10.1039/c0an00292e</pub-id> </citation>
</ref>
<ref id="B31">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Rodrigues</surname>
<given-names>J.&#x20;F.</given-names>
<suffix>Jr</suffix>
</name>
<name>
<surname>Florea</surname>
<given-names>L.</given-names>
</name>
<name>
<surname>de Oliveira</surname>
<given-names>M. C. F.</given-names>
</name>
<name>
<surname>Diamond</surname>
<given-names>D.</given-names>
</name>
<name>
<surname>Oliveira</surname>
<given-names>O. N.</given-names>
<suffix>Jr</suffix>
</name>
</person-group> (<year>2021</year>). <article-title>Big Data and Machine Learning for Materials Science</article-title>. <source>Discov. Mater.</source> <volume>1</volume>, <fpage>12</fpage>. <pub-id pub-id-type="doi">10.1007/s43939-021-00012-0</pub-id> </citation>
</ref>
<ref id="B32">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Rodrigues</surname>
<given-names>J.&#x20;F.</given-names>
<suffix>Jr</suffix>
</name>
<name>
<surname>Paulovich</surname>
<given-names>F. V.</given-names>
</name>
<name>
<surname>de Oliveira</surname>
<given-names>M. C.</given-names>
</name>
<name>
<surname>de Oliveira</surname>
<given-names>O. N.</given-names>
<suffix>Jr</suffix>
</name>
</person-group> (<year>2016</year>). <article-title>On the Convergence of Nanotechnology and Big Data Analysis for Computer-Aided Diagnosis</article-title>. <source>Nanomedicine</source> <volume>11</volume>, <fpage>959</fpage>&#x2013;<lpage>982</lpage>. <pub-id pub-id-type="doi">10.2217/nnm.16.35</pub-id> </citation>
</ref>
<ref id="B33">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Rodriguez-Mendez</surname>
<given-names>M. L.</given-names>
</name>
<name>
<surname>Apetrei</surname>
<given-names>C.</given-names>
</name>
<name>
<surname>Gay</surname>
<given-names>M.</given-names>
</name>
<name>
<surname>Medina-Plaza</surname>
<given-names>C.</given-names>
</name>
<name>
<surname>de Saja</surname>
<given-names>J.&#x20;A.</given-names>
</name>
<name>
<surname>Vidal</surname>
<given-names>S.</given-names>
</name>
<etal/>
</person-group> (<year>2014</year>). <article-title>Evaluation of Oxygen Exposure Levels and Polyphenolic Content of Red Wines Using an Electronic Panel Formed by an Electronic Nose and an Electronic Tongue</article-title>. <source>Food Chem.</source> <volume>155</volume>, <fpage>91</fpage>&#x2013;<lpage>97</lpage>. <pub-id pub-id-type="doi">10.1016/j.foodchem.2014.01.021</pub-id> </citation>
</ref>
<ref id="B34">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>S&#xe1;nchez-Gendriz</surname>
<given-names>I.</given-names>
</name>
<name>
<surname>Padovese</surname>
<given-names>L. R.</given-names>
</name>
</person-group> (<year>2017</year>). <article-title>Temporal and Spectral Patterns of Fish Choruses in Two Protected Areas in Southern Atlantic</article-title>. <source>Ecol. Inform.</source> <volume>38</volume>, <fpage>31</fpage>&#x2013;<lpage>38</lpage>. <pub-id pub-id-type="doi">10.1016/j.ecoinf.2017.01.003</pub-id> </citation>
</ref>
<ref id="B35">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Shimizu</surname>
<given-names>F. M.</given-names>
</name>
<name>
<surname>Tod&#xe3;o</surname>
<given-names>F. R.</given-names>
</name>
<name>
<surname>Gobbi</surname>
<given-names>A. L.</given-names>
</name>
<name>
<surname>Oliveira</surname>
<given-names>O. N.</given-names>
<suffix>Jr</suffix>
</name>
<name>
<surname>Garcia</surname>
<given-names>C. D.</given-names>
</name>
<name>
<surname>Lima</surname>
<given-names>R. S.</given-names>
</name>
</person-group> (<year>2017</year>). <article-title>Functionalization-free Microfluidic Electronic Tongue Based on a Single Response</article-title>. <source>ACS Sens.</source> <volume>2</volume>, <fpage>1027</fpage>&#x2013;<lpage>1034</lpage>. <pub-id pub-id-type="doi">10.1021/acssensors.7b00302</pub-id> </citation>
</ref>
<ref id="B36">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Solanki</surname>
<given-names>V.</given-names>
</name>
<name>
<surname>Krupanidhi</surname>
<given-names>S. B.</given-names>
</name>
<name>
<surname>Nanda</surname>
<given-names>K. K.</given-names>
</name>
</person-group> (<year>2017</year>). <article-title>Sequential Elemental Dealloying Approach for the Fabrication of Porous Metal Oxides and Chemiresistive Sensors Thereof for Electronic Listening</article-title>. <source>ACS Appl. Mater. Inter.</source> <volume>9</volume>, <fpage>41428</fpage>&#x2013;<lpage>41434</lpage>. <pub-id pub-id-type="doi">10.1021/acsami.7b12127</pub-id> </citation>
</ref>
<ref id="B37">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Wang</surname>
<given-names>Y.</given-names>
</name>
<name>
<surname>Liu</surname>
<given-names>Y.</given-names>
</name>
<name>
<surname>Cui</surname>
<given-names>Q.</given-names>
</name>
<name>
<surname>Li</surname>
<given-names>L.</given-names>
</name>
<name>
<surname>Ning</surname>
<given-names>J.</given-names>
</name>
<name>
<surname>Zhang</surname>
<given-names>Z.</given-names>
</name>
</person-group> (<year>2021</year>). <article-title>Monitoring the Withering Condition of Leaves during Black tea Processing via the Fusion of Electronic Eye (E-Eye), Colorimetric Sensing Array (CSA), and Micro-near-infrared Spectroscopy (NIRS)</article-title>. <source>J.&#x20;Food Eng.</source> <volume>300</volume>, <fpage>110534</fpage>. <pub-id pub-id-type="doi">10.1016/j.jfoodeng.2021.110534</pub-id> </citation>
</ref>
<ref id="B38">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Winquist</surname>
<given-names>F.</given-names>
</name>
</person-group> (<year>2008</year>). <article-title>Voltammetric Electronic Tongues - Basic Principles and Applications</article-title>. <source>Microchim. Acta</source> <volume>163</volume>, <fpage>3</fpage>&#x2013;<lpage>10</lpage>. <pub-id pub-id-type="doi">10.1007/s00604-007-0929-2</pub-id> </citation>
</ref>
<ref id="B39">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Yang</surname>
<given-names>J.&#x20;C.</given-names>
</name>
<name>
<surname>Mun</surname>
<given-names>J.</given-names>
</name>
<name>
<surname>Kwon</surname>
<given-names>S. Y.</given-names>
</name>
<name>
<surname>Park</surname>
<given-names>S.</given-names>
</name>
<name>
<surname>Bao</surname>
<given-names>Z.</given-names>
</name>
<name>
<surname>Park</surname>
<given-names>S.</given-names>
</name>
</person-group> (<year>2019</year>). <article-title>Electronic Skin: Recent Progress and Future Prospects for Skin&#x2010;Attachable Devices for Health Monitoring, Robotics, and Prosthetics</article-title>. <source>Adv. Mater.</source> <volume>31</volume>, <fpage>1904765</fpage>. <pub-id pub-id-type="doi">10.1002/adma.201904765</pub-id> </citation>
</ref>
</ref-list>
</back>
</article>