
PERSPECTIVE article

Front. Neurosci., 14 May 2021
Sec. Neuromorphic Engineering
This article is part of the Research Topic "Insights in Neuromorphic Engineering: 2021".

Autonomous Flying With Neuromorphic Sensing

  • 1Central Research and Technology, Airbus, Munich, Germany
  • 2U.S. Office of Naval Research Global, London, United Kingdom
  • 3United States Army Research Laboratory, Aberdeen Proving Ground, MD, United States
  • 4Airbus Defence and Space GmbH, Manching, Germany
  • 5U.S. Army Research Laboratory, London, United Kingdom
  • 6Department of Bioengineering, Imperial College London, London, United Kingdom
  • 7Institut de la Vision, INSERM UMRI S 968, Paris, France
  • 8Biomedical Science Tower, University of Pittsburgh, Pittsburgh, PA, United States
  • 9Robotics Institute, Carnegie Mellon University, Pittsburgh, PA, United States
  • 10Micro Air Vehicle Laboratory, Department of Control and Operations, Faculty of Aerospace Engineering, Delft University of Technology, Delft, Netherlands
  • 11The Neuroethological lab, Department of Neurobiology, The Rappaport Institute for Biomedical Research, Technion – Israel Institute of Technology, Haifa, Israel
  • 12Brain Research Center/Institute of Systems Neuroscience, National Tsing Hua University, Hsinchu, Taiwan
  • 13Laboratory of Comparative Neural Systems and Behavior, Department of Psychological and Brain Sciences, Neuroscience and Mechanical Engineering, Johns Hopkins University, Baltimore, MD, United States

Autonomous flight for large aircraft appears to be within our reach. However, launching autonomous systems for everyday missions still requires an immense interdisciplinary research effort supported by pointed policies and funding. We believe that concerted endeavors in the fields of neuroscience, mathematics, sensor physics, robotics, and computer science are needed to address the remaining crucial scientific challenges. In this paper, we argue for a bio-inspired approach to solve autonomous flying challenges, outline the frontier of sensing, data processing, and flight control within a neuromorphic paradigm, and chart directions of research needed to achieve operational capabilities comparable to those we observe in nature. One central problem of neuromorphic computing is learning. In biological systems, learning is achieved by adaptive and relativistic information acquisition characterized by near-continuous information retrieval with variable rates and sparsity. This results in savings of both energy and computational resources, which serves as an inspiration for autonomous systems. We consider pertinent features of insect, bat, and bird flight behavior as examples to address various vital aspects of autonomous flight. Insects exhibit sophisticated flight dynamics despite the comparatively reduced complexity of their brains; they therefore represent excellent model systems for the study of navigation and flight control. Bats and birds enable more complex models of attention and point to the importance of active sensing for conducting more complex missions. The implementation of neuromorphic paradigms for autonomous flight will require fundamental changes in both traditional hardware and software. We provide recommendations for sensor hardware and processing algorithm development to enable energy-efficient and computationally effective flight control.

Introduction

Autonomous flying capability in aerospace is gathering momentum, driven by the needs for enhanced safety, sustainability, and new missions. The latest developments in data science and machine learning have increased the relevance of autonomy for many areas, such as space operations, unmanned aerial vehicles (UAVs), (passenger) transport, manned-unmanned teaming, and air traffic management (National Research Council [NRC], 2014; Airbus, 2020a). Future scenarios will require coordination of a vast number of participants in contested air space, a need that can be met only by autonomous, self-organizing vehicles with unique sensing capabilities.

The latest (pilot-controlled) Airbus A350 has 50,000 sensors on board, collecting 2.5 terabytes of data every day (Airbus, 2020b). Processing the amount of sensory data needed for autonomous flight with current processing technologies requires high-performance computers consuming many kilowatts of energy. In contrast, a bird has two eyes, two ears, two vestibular organs, skin, several hundred olfactory receptors, and 25–500 taste buds, and is capable of magnetoreception (Wiltschko and Wiltschko, 2019; Ornithology.com, 2020). Considering that the largest bird brains weigh 20 g, contain 3 billion neurons, and consume ∼0.2 W (Olkowicz et al., 2016), we need a new paradigm in bio-inspired sensors and processing to overcome the computational and energy constraints of autonomous operation in complex three-dimensional high-speed environments.
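
To make the scale of this gap concrete, a back-of-envelope comparison can be made from the figures quoted above. The sketch below is purely illustrative; the 10 kW compute budget is an assumption standing in for "many kilowatts," while the other numbers are taken from the text.

```python
# Back-of-envelope comparison of a conventional avionics compute budget with a
# bird brain. The 10 kW value is an assumed stand-in for "many kilowatts"; the
# other figures (2.5 TB/day, 3e9 neurons, ~0.2 W) are quoted in the text.

SECONDS_PER_DAY = 24 * 3600

data_per_day_bytes = 2.5e12                              # 2.5 TB of sensor data per day (A350)
avg_data_rate = data_per_day_bytes / SECONDS_PER_DAY     # sustained average rate

hpc_power_w = 10_000.0        # assumed high-performance computing budget
bird_brain_power_w = 0.2      # largest bird brains (Olkowicz et al., 2016)
bird_brain_neurons = 3e9

print(f"Average sensor data rate: {avg_data_rate / 1e6:.1f} MB/s")
print(f"Power ratio (HPC / bird brain): {hpc_power_w / bird_brain_power_w:,.0f}x")
print(f"Average power per neuron in the bird brain: "
      f"{bird_brain_power_w / bird_brain_neurons:.1e} W")
```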

Neuromorphic engineering describes large-scale systems of integrated circuits that mimic neurobiological architectures present in the nervous system (Ma and Krings, 2009; Morabito et al., 2013; Zhang et al., 2020). It aims to harness the efficiencies of biological brains by developing essential computational analogs of neuronal circuits. To explore the potential of neuromorphic engineering for aviation sensors, a virtual workshop on the topic of "Neuromorphic Sensing and Processing Paradigm" was organized by Airbus, the U.S. Office of Naval Research Global (ONRG), and the U.S. DEVCOM Army Research Laboratory (ARL) in July 2020.

This perspective paper is one output of the workshop; it describes the needs and current advances, and provides directions for future research related to neuromorphic sensing and processing as enablers of autonomous flight missions. The goal is to call upon the scientific community and funding agencies to enable and perform the required multidisciplinary research in this area.

Missions and Needs

Autonomous flight for passenger aircraft has the potential to deliver increased fuel savings, reduce operating costs, and allow pilots to focus on strategic decision-making and mission management rather than aircraft handling. This is enabled by image processing, decision-making support, speech processing and interpretation, and mutual human-machine trust. Urban air mobility (UAM) also requires new roles and capabilities, such as safe ground control of multiple autonomous pilot-less vehicles. Moreover, autonomy is extremely important for the next generation of defense and security air systems, where one aim is to connect manned and unmanned components in operations. Another part of such a system can be a "swarm" of unmanned vehicles that coordinate independently to perform missions. This requires robust navigation in potentially obscured conditions, for missions such as reconnaissance, disaster management after catastrophes, search and rescue at sea or on the ground, surveillance, automatic recognition of objects and/or people, border security, imaging, and optimized air traffic control.

General vital capabilities for autonomous flight are attitude control (Goulard et al., 2016), height control and landing (Srinivasan et al., 2000; Baird et al., 2013), collision avoidance (Tammero and Dickinson, 2002; Rind et al., 2016), and navigating between places of interest (Webb, 2019). For search-and-rescue and security missions, high-speed vehicles able to operate in complex environments require fast response times. At the other end are long-duration (>1 day) missions with a great need for low energy consumption, where swarms of flying autonomous vehicles could bring benefits in terms of cost, safety, and performance. A wide range of capabilities is required, such as communication channels, predictive tracking, control of own-drone space (situational awareness), mission-dependent information, visual/spectral tracking, navigation, and control. For human-autonomy teaming missions, sharing of environmental sensory information, including temperature, humidity, airflow, and olfactory inputs (smoke, biohazards, explosives, etc.), is of major importance. For space missions, autonomous navigation and situational awareness are required, for example, in space-debris collision prevention. These capabilities can be enabled by a combination of multiple centralized and distributed sensors, as observed in biology. The multi-domain missions described above require simultaneous sensing modalities, where neuromorphic sensors may need to be combined with conventional ones, such as hyperspectral imaging.

To enable sensing and efficient real-time processing for future missions and to address real-world challenges, foundational advances in neuromorphic computing are needed at the theoretical, computational, algorithmic, and hardware implementation levels to overcome today's scalability limits of, e.g., finite state machines. For example, just as insects integrate information from vision and mechanoreceptors to instantaneously correct for changes in, e.g., airspeed (Buckminster Fuller et al., 2014), autonomous flying vehicles also require efficient and adaptive sensorimotor computations for instantaneous flight correction. Another challenge is navigating through complex dynamic environments with obstacle avoidance, such as flying through treetops with wind blowing through leaves and twigs. This requires luminance normalization (Hung et al., 2020), coupled with contextual processing of higher-order features, environmental and task-related visual statistics, as well as feedback and recurrent processing for visual prediction. Whereas in biology such normalization, contextualization, and prediction are interdependent processes, a major gap remains in their computational implementation today.
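
As one concrete illustration of the luminance-normalization step mentioned above, the sketch below applies a generic divisive (local-mean) normalization to an image. This is a textbook operation used only to show why normalization helps under changing illumination; it is not the specific model of Hung et al. (2020), and the kernel size and constant are illustrative assumptions.

```python
import numpy as np

def divisive_luminance_normalization(image, kernel_size=7, sigma=1e-3):
    """Divide each pixel by the mean luminance of its local neighborhood.

    The output is approximately invariant to multiplicative changes in
    illumination, which is the property needed when, e.g., sunlight filtering
    through moving leaves keeps rescaling local brightness.
    """
    h, w = image.shape
    pad = kernel_size // 2
    padded = np.pad(image, pad, mode="reflect")
    out = np.empty_like(image, dtype=float)
    for i in range(h):
        for j in range(w):
            local = padded[i:i + kernel_size, j:j + kernel_size]
            out[i, j] = image[i, j] / (sigma + local.mean())
    return out

# Doubling the illumination changes the normalized output only by an amount
# on the order of sigma, i.e., the representation is nearly illumination-invariant.
rng = np.random.default_rng(0)
scene = rng.random((32, 32))
diff = np.abs(divisive_luminance_normalization(scene) -
              divisive_luminance_normalization(2.0 * scene)).max()
print(f"max change after doubling illumination: {diff:.4f}")
```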

An ultimate task of autonomy, the ability to deal with unexpected situations, requires the system to learn and act upon sensory inputs, which is the embodiment of cognition. This requirement represents another computational challenge that may be met by neuromorphic implementation. A robust computational framework must be developed to realize learning in autonomy that mimics, at least to some degree, that of even simple animals. The inability to provide training data for all scenarios poses a challenge for fully autonomous flight to meet the commercial, defense, security, and space mission needs of the future. Given the vast area of research in neuromorphic processing, we treat learning and decision-making here only in the context of sensing.

Current Advances in Neuromorphic Sensing

Neuromorphic sensing occurs through a device that perceives changes in a certain parameter and converts them into a stream of events ("spikes") (Morabito et al., 2013). Thus, essentially any sensing modality can be converted into an event-based (EB) detector. For example, pixels in EB cameras can signal brightness changes exceeding some threshold; microphones in EB audio sensors can react to sound intensity variations in certain frequency ranges (Li et al., 2012; Liu et al., 2014; Inilabs.com, 2020); EB chemical or olfactory sensor arrays or bio-hybrid sensors can fire as a result of deviations in the concentration of a chemical species (Koickal et al., 2007; Chiu and Tang, 2013; Vanarse et al., 2017; Anderson et al., 2020); and EB tactile sensors respond to changes in force and movement (Kim et al., 2018; Baghaei Naeini et al., 2020).
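
The common principle behind all of these modalities, emitting an event only when the sensed quantity changes by more than a threshold, can be sketched in a few lines. The code below is a generic illustration of that change-detection idea, not a model of any particular commercial sensor; the threshold and signal values are illustrative assumptions.

```python
import math

def to_events(samples, threshold=0.15):
    """Convert a sampled signal into ON/OFF events, the generic EB principle.

    An event (t, +1) or (t, -1) is emitted whenever the log of the signal
    drifts by more than `threshold` from its value at the last event.
    Between events nothing is transmitted, which is where the sparsity and
    bandwidth savings of event-based sensing come from.
    """
    events = []
    ref = math.log(samples[0])
    for t, x in enumerate(samples[1:], start=1):
        delta = math.log(x) - ref
        if abs(delta) >= threshold:
            events.append((t, 1 if delta > 0 else -1))
            ref = math.log(x)
    return events

# A brightness ramp produces a handful of events; a constant signal produces none.
print(to_events([1.0, 1.05, 1.1, 1.3, 1.6, 1.6, 1.6, 1.2]))   # [(3, 1), (4, 1), (7, -1)]
```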

The most prevalent example of EB sensors is the video camera (Lichtsteiner et al., 2008; Posch et al., 2011; Chen et al., 2012; Brandli et al., 2014; Son et al., 2017; Taverni et al., 2018; Clarke, 2020), which despite its relatively recent birth has already undergone significant evolution. Event-based cameras have the necessary properties to become foundational sensors for autonomous flight, solving the challenges of power, computing, and timing requirements and enabling local decision-making. In addition to being affordable and fast, they offer independence from lighting conditions and a large dynamic range. Event-based sensors can process the equivalent of frame rates of several hundred kHz using conventional adaptive CPU hardware, enabling computations that are impossible to carry out with classic frame-based sensors. Examples are real-time optical flow (Benosman et al., 2014), pose estimation (Reverter Valeiras et al., 2016, 2018), time-based machine learning (Lagorce et al., 2017), aperture-free optical flow (Akolkar et al., 2020), and many other applications that have now been developed using purely temporal mechanisms performed at almost real-time speeds.
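
To give a flavor of these purely temporal mechanisms, the sketch below builds a simplified exponentially decaying "time surface" from a stream of camera events, loosely in the spirit of the time-surface representations of Lagorce et al. (2017). The event format, sensor size, and time constant are illustrative assumptions, not the published algorithm.

```python
import numpy as np

def time_surface(events, shape=(8, 8), t_now=None, tau=20e-3):
    """Build an exponentially decaying time surface from (t, x, y) events.

    Each pixel holds exp(-(t_now - t_last)/tau), so recently active pixels are
    close to 1 and stale pixels decay toward 0. The surface summarizes local
    spatio-temporal context without ever forming a conventional frame.
    """
    last_ts = np.full(shape, -np.inf)
    for t, x, y in events:
        last_ts[y, x] = t            # keep only the most recent event per pixel
    if t_now is None:
        t_now = max(t for t, _, _ in events)
    return np.exp(-(t_now - last_ts) / tau)

# Three events on an 8x8 sensor: the most recent event dominates the surface,
# older ones decay, and untouched pixels stay at zero.
events = [(0.000, 1, 1), (0.010, 4, 4), (0.030, 6, 2)]
surface = time_surface(events)
print(surface[2, 6], surface[4, 4], surface[1, 1])   # ~1.0, ~0.37, ~0.22
```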

Another neuromorphic approach addresses active sensing: the world's first spiking neural network-based chip for radar signal processing was announced recently. The first application was reported to be a low-power smart anti-collision radar system for drone collision avoidance; future plans are to process a variety of active sensor data, including electrocardiogram, speech, sonar, radar, and LIDAR streams (Liu, 2020). Reportedly, the chip consumes 100 times less power than traditional implementations and provides a 10× reduction in latency.

The algorithmic and hardware transitions to EB sensing platforms are driven by the desire to reduce latency, to achieve orders-of-magnitude improvements in energy efficiency, dynamic range, and sensitivity, to solve complex control problems with limited computing resources, and to give autonomous systems the capability to adapt to operation in unpredictable dynamic environments. Hence, they have recently been applied successfully in space surveillance (Roffe et al., 2021) and for controlled landing of micro air vehicles (Dupeyroux et al., 2020). However, despite the progress achieved in the last decade by state-of-the-art neuromorphic sensors, several fundamental barriers separate them from real-life applications. For example, visual EB sensors have limited ability to handle high focal-plane-array utilization due to complex illumination or clutter, as well as pixel response inhomogeneity. In terms of sensor data processing, a major current challenge is to develop spiking neural network learning principles, concurrently advancing both algorithms and hardware, to enable the disparate sensor data fusion inherent to biological sensing.

Bio-Inspiration From Flying Animals and Insects

To address the challenges of energy-efficient real-time processing of multiple sensing modalities, and the ability to deal with unexpected situations as described above, we can look toward flying animals and insects. For example, spatial navigation builds upon a network of cognitive functions. Animals that rely on active sensing present particularly powerful models to guide the implementation of cognitive functions in the design of autonomous navigation systems, as their actions reflect cognitive states and directly influence the signals used to represent the environment, which, in turn, guide 3D movement. Echolocating bats, for example, transmit sonar cries and use the information carried by returning echoes to determine the 3D position, size, and other features of objects in the environment (Simmons and Vernon, 1971; Simmons, 1973; Griffin, 1974; Busnel and Fish, 1980; Nachtigall and Moore, 1988; Moss and Schnitzler, 1989; Thomas et al., 2004; Moss and Surlykke, 2010). Central to successful 3D navigation of autonomous vehicles in cluttered environments is spatial attention. Attention invokes mechanisms that allocate computational resources to selectively process and enhance relevant information from the environment (Broadbent, 1957). It has been demonstrated in bats that sonar-guided attention drives state-dependent processing of echo returns in brain regions that play key roles in spatial navigation. Specifically, attention sharpens the responses of auditory midbrain neurons that encode an object's 3D location in egocentric coordinates and of hippocampal place cells that encode the animal's 3D location in allocentric coordinates (Yartsev and Ulanovsky, 2013; Kothari et al., 2018; Wohlgemuth et al., 2018). Moreover, bats adjust the directional aim and temporal patterning of echolocation signals to inspect objects in the environment, which reveals adaptive sonar signal design tailored to the task at hand. Current sonar technologies do not yet implement such adaptive signal transmission and processing, which may explain the superior performance of animals over artificial devices. New technologies that incorporate more complete knowledge of animal echolocation systems will pave the way for advanced 3D sonar-guided navigation of autonomous vehicles in dark and cluttered environments.
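
As a minimal numerical illustration of the sonar geometry underlying this behavior, the sketch below converts echo delay into target range and an interaural time difference into an approximate azimuth. This is textbook acoustics rather than a model of the bat auditory system, and the assumed ear separation is purely illustrative.

```python
import math

SPEED_OF_SOUND = 343.0   # m/s in air at roughly 20 degrees C

def echo_delay_to_range(delay_s):
    """Round-trip echo delay -> target range (the pulse travels out and back)."""
    return SPEED_OF_SOUND * delay_s / 2.0

def itd_to_azimuth(itd_s, ear_separation_m=0.014):
    """Interaural time difference -> approximate azimuth (far-field assumption).

    The 14 mm ear separation is an assumed, illustrative value.
    """
    x = SPEED_OF_SOUND * itd_s / ear_separation_m
    return math.degrees(math.asin(max(-1.0, min(1.0, x))))

# A 5.8 ms echo delay corresponds to a target about 1 m away; a 20 microsecond
# interaural difference corresponds to roughly 29 degrees off-axis.
print(f"range: {echo_delay_to_range(5.8e-3):.2f} m")
print(f"azimuth: {itd_to_azimuth(20e-6):.1f} deg")
```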

For barn owls, efficient predators adapted to hunting rodents in extremely low light, attention is also of key importance for preventing overload of information-processing capacity and for stable behavioral control in the face of multiple distractors in cluttered and noisy environments (Posner, 2016). Neurons in the optic tectum of barn owls respond preferentially to visual and auditory stimuli that break a regular pattern in their background (Zahar et al., 2018) or that are categorically stronger than competing stimuli (Mysore et al., 2011). Unique neural adaptations (Netser et al., 2011), circuit motifs (Mysore and Knudsen, 2012), and top-down modulation (Winkowski and Knudsen, 2008) that facilitate the stimulus-selection process at the neuronal level have been identified. Such insights into the mechanisms of the barn owl's neural activity may teach us information-processing strategies that are efficient and behaviorally useful.

The barn owl’s intriguing capability to avoid obstacles in dark conditions seems to rely on spatial memory and a strong sense of self-position within a memorized map of space (Payne, 1971). Preliminary results have begun to reveal the neural representation of the owl’s location and direction in space and provide a promising avenue for new inspiration for autonomous aviation in extreme lighting conditions (Agarwal and Gutfreund, 2019).

Another entry point is to focus on insect brains, which are of lesser complexity and therefore more amenable to modeling. Insect brains are particularly adapted to resource-constrained scenarios, as in the case of flying machines, while showing remarkable functionality that allows insects to perform complicated tasks with great ease, such as (visual) collision avoidance, localization, communication, navigation, odor source localization, and social interaction in unknown, unpredictable environments (Smith, 2007; Schneider and Levine, 2014; Weir and Dickinson, 2015; Fu et al., 2019; Huang et al., 2019; Hu et al., 2020).

For example, Drosophila melanogaster, the fruit fly, has a small brain with ∼100,000 neurons that is highly efficient in sensory processing. Although it has extremely poor spatial resolution (only ∼800 pixels in each compound eye), the visual system of fruit flies and other flies is highly sensitive to movement (Borst et al., 2010; Borst, 2018), inspiring the development of the EB cameras described earlier. Moreover, several downstream layers of neural circuits compute and process optical flow (Weir and Dickinson, 2015; Mauss and Borst, 2020). This innate neural “computation” endows a fly with the abilities to detect threats (high-speed moving objects), avoid obstacles, control its flight course (Mauss and Borst, 2020), and estimate its orientation (Su et al., 2017), which is exactly what we need for autonomous flying vehicles, without requiring more than rudimentary object recognition and classification, which are computationally expensive in today’s artificial neural network architectures. Other examples are locusts and flies, which can detect impending visual collisions with a specialized neural structure (Fu et al., 2019). Recently, this structure has been translated into bio-plausible neural models that were applied on autonomous mobile robots and UAVs with constrained computational resources (Huang et al., 2019; Hu et al., 2020).
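
A minimal sketch of the kind of motion computation performed by these early fly circuits is the classic Hassenstein-Reichardt elementary motion detector shown below. It is a deliberately simplified, textbook stand-in for the optic-flow circuitry reviewed in the works cited here, not a biophysical model; the signals and delay are illustrative assumptions.

```python
import numpy as np

def reichardt_emd(left, right, delay=1):
    """Hassenstein-Reichardt elementary motion detector (simplified).

    Two neighboring photoreceptor signals are cross-correlated after delaying
    one of them; the sign of the output indicates motion direction
    (left-to-right positive, right-to-left negative).
    """
    left, right = np.asarray(left, float), np.asarray(right, float)
    d_left = np.roll(left, delay)      # delayed copy of the left input
    d_right = np.roll(right, delay)    # delayed copy of the right input
    d_left[:delay] = 0.0               # discard samples wrapped around by roll
    d_right[:delay] = 0.0
    return np.sum(d_left * right - d_right * left)

# A bright edge passing left -> right gives a positive response; the
# time-reversed stimulus (right -> left motion) gives the opposite sign.
left_signal = [0, 1, 0, 0, 0, 0]
right_signal = [0, 0, 1, 0, 0, 0]
print(reichardt_emd(left_signal, right_signal))   # > 0
print(reichardt_emd(right_signal, left_signal))   # < 0
```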

Outlook Toward Advancing Neuromorphic Applications

So, how can we translate nature’s amazing capabilities into autonomous flying vehicles with limited energy supply? For example, flying nano-drones (Ma et al., 2013; Karásek et al., 2018) mimicking the capabilities of fruit flies will unlock novel opportunities, such as the development of nano-drone swarms that can explore and monitor unknown indoor or outdoor environments (Brambilla et al., 2013; McGuire et al., 2019). Recently, a neuromorphic chip was used in the control loop of a flying drone able to perform smooth optical-flow landings, like honeybees, which immediately illustrated the energy efficiency and speed promised by such neuromorphic applications (Dupeyroux et al., 2020). Computational principles of dynamic vision in fruit flies or other insects can be implemented together with EB cameras and used in parallel with the slower and energetically demanding computer vision systems that are designed for object recognition. In this way, a UAV can detect obstacles or potential threats, using low-powered passive vision, even before they are recognized. For larger aircraft, advances in neuromorphic computing could lead to improved sensing and sensory fusion, including real-world resilience and prediction, inspired by the role of biological recurrent networks in solving such challenges (Tang et al., 2018; Kubilius et al., 2019).
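
The honeybee-style landing mentioned here can be captured by a simple control law: regulate the optic-flow divergence (descent speed divided by height) to a constant value, which automatically produces a smooth exponential deceleration without ever measuring absolute height. The sketch below is a plain discrete-time illustration of that principle, not the spiking controller of Dupeyroux et al. (2020); all gains and initial conditions are illustrative assumptions.

```python
def constant_divergence_landing(h0=10.0, v0=-2.0, target_div=0.2,
                                gain=2.0, dt=0.02, steps=1500):
    """Simulate a landing that regulates optic-flow divergence D = -v/h.

    Keeping D constant makes the descent speed shrink in proportion to the
    remaining height, so touchdown is automatically gentle. Only the visually
    observable ratio v/h is used, never the absolute height.
    """
    h, v = h0, v0
    trajectory = []
    for _ in range(steps):
        divergence = -v / h                        # observable from optic flow alone
        a = gain * (divergence - target_div)       # brake when descending too fast
        v += a * dt
        h += v * dt
        trajectory.append((h, v))
        if h <= 0.05:                              # consider the vehicle landed
            break
    return trajectory

traj = constant_divergence_landing()
print(f"steps: {len(traj)}, final height: {traj[-1][0]:.3f} m, "
      f"final speed: {traj[-1][1]:.3f} m/s")
```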

Autonomous flight with neuromorphic efficiency requires a cross-disciplinary effort, and EB temporal computation requires new thinking. One missing piece of the puzzle for creating truly neuromorphic systems is the computational hardware. Such an architecture is expected to be extremely low power while operating truly in real time at the native resolution of the sensor. An optimal solution would be to approach the problem by considering the timing of events and developing techniques in which each event incrementally adds information to what has already been computed (Benosman et al., 2014). This idea of local computation is not new and has been described in most real neural networks, where layers of neurons process information on their own independent time scales based on the received sensory data rather than relying on any form of “global” clock or memory.
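
A toy example of this per-event, clock-free style of computation is the tracker sketched below: every incoming event nudges a position estimate a little, so the estimate is always up to date without frames, buffers, or a global processing loop. The event format, gain, and time constant are illustrative assumptions rather than a published algorithm.

```python
class EventDrivenTracker:
    """Track an object position by incremental per-event updates.

    Each event moves the estimate a small step toward the event location,
    weighted by how long ago the previous update happened. There is no frame,
    no buffer, and no global clock: each event adds its increment of
    information to what has already been computed.
    """

    def __init__(self, x0=0.0, y0=0.0, gain=0.2, tau=0.05):
        self.x, self.y = x0, y0
        self.gain, self.tau = gain, tau
        self.t_last = None

    def update(self, t, ex, ey):
        # Stale estimates are trusted less, so the effective step size grows
        # with the time elapsed since the previous event.
        if self.t_last is None:
            w = 1.0
        else:
            w = min(1.0, self.gain * (1.0 + (t - self.t_last) / self.tau))
        self.x += w * (ex - self.x)
        self.y += w * (ey - self.y)
        self.t_last = t
        return self.x, self.y

tracker = EventDrivenTracker()
for t, ex, ey in [(0.00, 1.0, 1.0), (0.01, 1.2, 1.1), (0.02, 1.4, 1.2)]:
    print(tracker.update(t, ex, ey))
```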

Potential advantages of neuromorphic computing for active vs. passive imaging (e.g., bat echolocation vs. owl vision) should also be explored. Active vision (e.g., dense LIDAR) can provide limited 3D sensing but is challenged by objects such as trees, poles, and traffic lights (Chauhan et al., 2014), whereas passive imaging is preferred for maintaining stealth but is energetically expensive. Both areas of research have been dominated by convolutional approaches, and an open question is how to fuse active and passive sensors, including antennae, and their data for efficient, resilient, and adaptive sensory-decision-motor loops. Sensory information in biological brains is represented in the same way for all signals: there is occurrence and temporal correlation, no distinction between inputs, and a generic way of generating events triggered by the data. This saves considerable time and energy, and EB sensors could aid in emulating biology in this sense.

Top-down (internally guided, mission-driven) and bottom-up (externally driven) attention (Katsuki and Constantinidis, 2014) are the neural processes that may solve the bottleneck in sensory information processing. With these two types of attention mechanisms, the brain’s central executive is able to focus on mission-relevant sensory signals while maintaining the flexibility to rapidly switch to other sensory signals that occur unexpectedly. Predictive coding might play a crucial role here, because it generates and updates an internal representation (or mental model) of the environment, and attention is required only when a prediction error occurs, which causes the system to shift to a high-power mode.
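
A minimal sketch of this idea: maintain a cheap running prediction of the incoming signal and wake the expensive processing path only when the prediction error exceeds a threshold. The two-mode split, learning rate, and threshold below are illustrative assumptions, not a specific published architecture.

```python
def prediction_gated_processing(samples, alpha=0.3, error_threshold=0.5):
    """Run a cheap predictive model; flag 'high-power' processing only when the
    prediction error is large, i.e., when something unexpected happens.

    `alpha` is the learning rate of the running prediction; both it and the
    error threshold are illustrative values.
    """
    prediction = samples[0]
    wake_ups = []
    for t, x in enumerate(samples[1:], start=1):
        error = abs(x - prediction)
        if error > error_threshold:
            wake_ups.append(t)                    # shift to high-power mode here
        prediction += alpha * (x - prediction)    # cheap low-power model update
    return wake_ups

# A slowly drifting signal needs no attention; the sudden jump at t=6 engages
# the high-power mode, which stays active until the cheap model has adapted.
signal = [0.0, 0.05, 0.1, 0.12, 0.15, 0.18, 1.4, 1.45, 1.5]
print(prediction_gated_processing(signal))   # -> [6, 7, 8]
```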

An additional layer of complexity is presented by neuromorphic computing inspired by biological principles for learning, which is needed for adaptive, resilient, and resource-efficient distributed sensing and learning (i.e., by swarms and other sensors) (Thomas, 1997; Abdelzaher et al., 2018), e.g., of target signatures and harsh environmental conditions, and for assured low-bandwidth communication. Progress on these challenges would create a framework of foundational principles, e.g., for predicting patterns and decisions from complex dynamics.

We have identified a clear need to enhance understanding of the neurosensory systems of nature’s flying creatures, which should result in new and better mathematical models for autonomous flying vehicles (see Figure 1). The long-term goal is hardware and software design and prototyping for interacting autonomous vehicles. Our target is neuromorphic hardware that mimics the functions of neural cells in custom synthetic hardware that is analog, digital, and asynchronous in its information processing and is vastly more energy-efficient and lighter than classical silicon circuitry. Such neuromorphic technology is expected to disrupt existing solutions and be a key enabler for real-time processing of different sensor modalities through lower cost, lower energy consumption, and lower weight, while adapting to changing missions, providing enhanced and resilient performance, and saving human lives.

FIGURE 1

Figure 1. Schematic route from bio-inspired behaviors toward neuromorphic sensors for autonomous flight. Animal figures are used under Creative Commons licenses via https://www.pexels.com.

In Table 1, we have created an overview of the current challenges toward autonomous flight and how the biology of flying creatures can inspire us in the coming years to reach the desired milestones. To summarize our recommendations:

TABLE 1

Table 1. Neuromorphic sensing for autonomous capabilities – roadmap.

1. Develop EB neuromorphic sensor hardware and processing algorithms to support resilient and efficient navigation and collision avoidance

2. Develop computationally efficient flight control with fast sensor-to-actuator responses to support resilience

3. Develop neuromorphic attentional, sensory fusion, and representational mechanisms to increase efficiency and goal-directed performance in complex scenarios

4. Develop neuromorphic approaches to learning for adaptive and predictive sensing and control

5. Develop principles to integrate neuromorphic and convolutional approaches to harness their mutual advantages

Data Availability Statement

The original contributions presented in the study are included in the article; further inquiries can be directed to the corresponding author/s.

Author Contributions

PP created the outline, wrote the introduction, collected inputs, and integrated them into the full manuscript. AK and CH wrote sections and reviewed the manuscript. YG, C-CL, RB, CM, and GC reviewed the manuscript and contributed sections that were integrated into the manuscript by PP. AS and FG reviewed the manuscript. All authors contributed to the article and approved the submitted version.

Funding

The barn owl research is partly funded by grants from the Israel Science Foundation (ISF) and the Rappaport Foundation for Biomedical Research. C-CL is partially supported by the Ministry of Science and Technology (Taiwan) grant number 109-2218-E-007-021. NSF Brain Initiative [NCS-FO 1734744 (2017-2021)], Air Force Office of Scientific Research (FA9550-14-1-0398NIFTI), and Office of Naval Research (N00014-17-1-2736) grants to CM supported research reported in this review.

Disclaimer

The views and conclusions contained in this document are those of the authors and should not be interpreted as representing the official policies, either expressed or implied, of the DEVCOM Army Research Laboratory or the U.S. Government. The U.S. Government is authorized to reproduce and distribute reprints for Government purposes notwithstanding any copyright notation herein.

Conflict of Interest

PP and AS are employed by the company Airbus Defence and Space GmbH.

The remaining authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

References

Airbus (2020a). Autonomous Flight – Towards a World of More Autonomous Air Travel. Available online at: https://www.airbus.com/innovation/autonomous-and-connected/autonomous-flight.html [accessed October 5, 2020].


National Research Council [NRC] (2014). Autonomy Research for Civil Aviation: Toward a New Era of Flight. Washington, DC: The National Academies Press, doi: 10.17226/18815


Airbus (2020b). Data Revolution in Aviation. Available online at: https://www.airbus.com/public-affairs/brussels/our-topics/innovation/data-revolution-in-aviation.html [accessed November 11, 2020].


Olkowicz, S., Kocourek, M., Lučan, R. K., Porteš, M., Fitch, W. T., Herculano-Houzel, S., et al. (2016). Birds have primate-like numbers of neurons in the forebrain. PNAS 113, 7255–7260. doi: 10.1073/pnas.1517131113


Morabito, F. C., Andreou, A. G., and Chicca, E. (2013). Editorial – Neuromorphic Engineering: from neural systems to brain-like engineered systems. Neural. Networks 45, 1–3. doi: 10.1016/j.neunet.2013.07.001


Ma, Z., and Krings, A. W. (2009). Insect sensory systems inspired computing and communications. Ad. Hoc. Networks 7, 742–755. doi: 10.1016/j.adhoc.2008.03.003


Zhang, W., Gao, B., Tang, J., Yao, P., Yu, S., Chang, M.-F., et al. (2020). Neuro-inspired computing chips. Nat. Electr. 3, 371–382. doi: 10.1038/s41928-020-0435-7


Lichtsteiner, P., Posch, C., and Delbruck, T. (2008). A 128x128 120 dB 15μs Latency Asynchronous Temporal Contrast Vision Sensor. IEEE J. Solid State Cir. 43, 566–576. doi: 10.1109/JSSC.2007.914337


Posch, C., Matolin, D., and Wohlgenannt, R. (2011). A QVGA 143 dB Dynamic Range Frame-Free PWM Image Sensor With Lossless Pixel-Level Video Compression and Time-Domain CDS. IEEE J. Solid State Cir. 46, 259–275. doi: 10.1109/JSSC.2010.2085952


Brandli, C., Berner, R., Yang, M., Liu, S., and Delbruck, T. (2014). A 240 × 180 130 dB 3 μs Latency Global Shutter Spatiotemporal Vision Sensor. IEEE J. Solid State Cir. 49, 2333–2341. doi: 10.1109/JSSC.2014.2342715


Taverni, G., Moeys, D. P., and Li, C. (2018). Front and Back Illuminated Dynamic and Active Pixel Vision Sensors Comparison. IEEE Trans. Cir. Syst. II 65, 677–681. doi: 10.1109/TCSII.2018.2824899


Clarke, P. (2020). Swiss Event-Based Vision Startup Launches Next-Gen Chip. Available online at: https://www.eenewsanalog.com/news/swiss-event-based-vision-startup-launches-next-gen-chip [accessed October 2020].


Son, B., Suh, Y., Jung, H., and Kim, J.-S. (2017). “A 640×480 dynamic vision sensor with a 9μm pixel and 300Meps address-event representation,” in 2017 IEEE International Solid-State Circuits Conference (ISSCC), (San Francisco: IEEE), 66–67. doi: 10.1109/ISSCC.2017.7870263


Chen, S., Tang, W., and Zhang, X. (2012). “A 64x 64 Pixels UWB Wireless Temporal-Difference Digital Image Sensor,” in IEEE Transactions on Very Large Scale Integration (VLSI) Systems, (San Francisco: IEEE), 2232–2240. doi: 10.1109/TVLSI.2011.2172470


Li, C., Delbruck, T., and Liu, S. (2012). “Real-time speaker identification using the AEREAR2 event-based silicon cochlea,” in 2012 IEEE International Symposium on Circuits and Systems (ISCAS), Seoul, (San Francisco: IEEE), 1159–1162. doi: 10.1109/ISCAS.2012.6271438


Liu, S., van Schaik, A., Minch, B. A., and Delbruck, T. (2014). Asynchronous Binaural Spatial Audition Sensor With 2x64x4 Channel Output. IEEE Trans. Biomed. Cir. Syst. 8, 453–464. doi: 10.1109/TBCAS.2013.2281834


Inilabs.com (2020). Dynamic Audio Sensor. Available online at: https://inilabs.com/products/dynamic-audio-sensor/ [accessed October 2020].


Koickal, T. J., Hamilton, A., Tan, S. L., Covington, J. A., and Gardner, J. W. (2007). “Analog VLSI circuit implementation of an adaptive neuromorphic olfaction chip,” in IEEE Transactions on Circuits and Systems I: Regular Papers, (San Francisco: IEEE), Vol. 54, 60–73. doi: 10.1109/TCSI.2006.888677


Chiu, S.-W., and Tang, K.-T. (2013). Towards a chemiresistive sensor-integrated electronic nose: a review. Sensors 13, 14214–14247. doi: 10.3390/s131014214


Vanarse, A., Osseiran, A., and Rassau, A. (2017). An Investigation into Spike-Based Neuromorphic Approaches for Artificial Olfactory Systems. Sensors 17:2591. doi: 10.3390/s17112591


Baghaei Naeini, F., AlAli, A. M., Al-Husari, R., and Rigi, A. (2020). A Novel Dynamic-Vision-Based Approach for Tactile Sensing Applications. IEEE Trans. Instrum. Measur. 69, 1881–1893. doi: 10.1109/TIM.2019.2919354


Kim, Y., Chortos, A., Xu, W., Liu, Y., Oh, J. Y., Son, D., et al. (2018). A bioinspired flexible organic artificial afferent nerve. Science 360, 998–1003. doi: 10.1126/science.aao0098


Liu, J. (2020). Press Release: IMEC Builds World’s First Spiking Neural Network-Based Chip for Radar Signal Processing. Available online at: https://www.imec-int.com/en/articles/imec-builds-world-s-first-spiking-neural-network-based-chip-for-radar-signal-processing [accessed October 2020].


Buckminster Fuller, S., Straw, A. D., Peek, M. Y., Murray, R. M., and Dickinson, M. H. (2014). Flying Drosophila stabilise vision-based velocity controller by sensing wind with their antennae. PNAS 111, E1182–E1191. doi: 10.1073/pnas.1323529111


Hung, C. P., Callahan-Flintoft, C., Fedele, P. D., Fluitt, K. F., Odoemene, O., Walker, A. J., et al. (2020). Abrupt darkening under high dynamic range (HDR) luminance invokes facilitation for high-contrast targets and grouping by luminance similarity. J. Vis. 20, 1–16. doi: 10.1167/jov.20.7.9


Chauhan, I., Brenner, C., Garg, R. D., and Parida, M. (2014). A new approach to 3D dense LiDAR data classification in urban environment. J. Ind. Soc. Remote Sens. 42, 673–678. doi: 10.1007/s12524-013-0354-4


Abdelzaher, T., Ayanian, N., and Basar, T. (2018). Toward an internet of battlefield things: a resilience perspective. Computer 51, 24–36. doi: 10.1109/MC.2018.2876048


Thomas, A. L. R. (1997). On the tails of birds. BioScience 47, 215–225. doi: 10.2307/1313075


Wiltschko, R., and Wiltschko, W. (2019). Magnetoreception in birds. J. R. Soc. Interface 16:20190295. doi: 10.1098/rsif.2019.0295


Benosman, R. B., Clercq, C., Lagorce, X., Ieng, S. H., and Bartolozzi, C. (2014). Event-based visual flow. IEEE Trans. Neural Netw. Learn. Syst. 25, 407–417. doi: 10.1109/tnnls.2013.2273537


Reverter Valeiras, D., Orchard, G., Ieng, S. H., and Benosman, R. B. (2018). Neuromorphic event-based 3d pose estimation. Front. Neurosci. 9:522. doi: 10.3389/fnins.2015.00522


Reverter Valeiras, D., Kime, S., Ieng, S. H., and Benosman, R. B. (2016). An event-based solution to the perspective-n-point problem. Front. Neurosci. 10:208. doi: 10.3389/fnins.2016.00208


Lagorce, X., Orchard, G., Galluppi, F., Shi, B. E., and Benosman, R. B. (2017). Hots: a hierarchy of event-based time-surfaces for pattern recognition. IEEE Trans. Patt. Analys. Mach. Intell. 39, 1346–1359. doi: 10.1109/tpami.2016.2574707


Akolkar, H., Ieng, S. H., and Benosman, R. B. (2020). Real-time high speed motion prediction using fast aperture-robust event-driven visual flow. arXiv preprint arXiv:1811.11135 [preprint].


Goulard, R., Vercher, J. L., and Viollet, S. (2016). To crash or not to crash: how do hoverflies cope with free-fall situations and weightlessness? J. Exp. Biol. 219, 2497–2503. doi: 10.1242/jeb.141150


Baird, E., Boeddeker, N., Ibbotson, M. R., and Srinivasan, M. V. (2013). A universal strategy for visually guided landing. Proc. Natl. Acad. Sci. U. S. A. 110, 18686–18691. doi: 10.1073/pnas.1314311110


Srinivasan, M. V., Zhang, S. W., Chahl, J. S., Barth, E., and Venkatesh, S. (2000). How honeybees make grazing landings on flat surfaces. Biol. Cyber. 83, 171–183. doi: 10.1007/s004220000162


Rind, F. C., Wernitznig, S., Pölt, P., Zankel, A., Gütl, D., Sztarker, J., et al. (2016). Two identified looming detectors in the locust: ubiquitous lateral connections among their inputs contribute to selective responses to looming objects. Scient. Rep. 6:35525.


Tammero, L. F., and Dickinson, M. H. (2002). Collision-avoidance and landing responses are mediated by separate pathways in the fruit fly, Drosophila melanogaster. J. Exp. Biol. 205, 2785–2798.


Webb, B. (2019). The internal maps of insects. J. Exp. Biol. 222:jeb188094. doi: 10.1242/jeb.188094


Griffin, D. R. (1974). Listening in the Dark: The Acoustic Orientation of Bats and Men. USA: Dover Publications.


Simmons, J. A. (1973). The Resolution of Target Range by Echolocating Bats. J. Acoust. Soc. Am. 54, 157–173. doi: 10.1121/1.1913559


Simmons, J. A., and Vernon, J. A. (1971). Echolocation: discrimination of Targets by the Bat, Eptesicus Fuscus. J. Exp. Zool. 176, 315–328. doi: 10.1002/jez.1401760307


Busnel, R. G., and Fish, J. F. (eds) (1980). General Bibliography. Animal Sonar Systems. New York: Plenum Publishing.


Nachtigall, P. E., and Moore, P. W. B. (eds) (1988). Animal Sonar: Processes and Performance. Boston, MA: Springer.


Moss, C. F., and Schnitzler, H. U. (1989). Accuracy of Target Ranging in Echolocating Bats: acoustic Information Processing. J. Comp. Physiol. A 165, 383–393. doi: 10.1007/bf00619357


Thomas, J. A., Moss, C. F., and Vater, M. (eds) (2004). Echolocation in Bats and Dolphins. Chicago: University of Chicago Press.


Moss, C. F., and Surlykke, A. (2010). Probing the Natural Scene by Echolocation in Bats. Front. Behav. Neurosci. 4:33. doi: 10.3389/fnbeh.2010.00033


Kothari, N. B., Wohlgemuth, M. J., and Moss, C. F. (2018). Dynamic Representation of 3D Auditory Space in the Midbrain of the Free-Flying Echolocating Bat. eLife 7:e29053. doi: 10.7554/eLife.29053


Wohlgemuth, M. J., Yu, C., and Moss, C. F. (2018). 3D Hippocampal Place Field Dynamics in Free-Flying Echolocating Bats. Front. Cell. Neurosci. 12:270. doi: 10.3389/fncel.2018.00270


Yartsev, M. M., and Ulanovsky, N. (2013). Representation of Three-Dimensional Space in the Hippocampus of Flying Bats. Science 340, 367–372. doi: 10.1126/science.1235338


Borst, A., Haag, J., and Reiff, D. F. (2010). Fly Motion Vision. Ann. Rev. Neurosci. 33, 49–70.


Borst, A. (2018). Biophysical mechanism for preferred direction enhancement in fly motion vision. PLoS Comput. Biol. 14:e1006240. doi: 10.1371/journal.pcbi.1006240


Mauss, A. S., and Borst, A. (2020). Optic flow-based course control in insects. Curr. Opin. Neurobiol. 60, 21–27. doi: 10.1016/j.conb.2019.10.007


Weir, P. T., and Dickinson, M. H. (2015). Functional divisions for visual processing in the central brain of flying Drosophila. Proc. Natl. Acad. Sci. U. S. A. 112, E5523–E5532.


Su, T. S., Lee, W. J., Huang, Y. C., Wang, C. T., and Lo, C. C. (2017). Coupled symmetric and asymmetric circuits underlying spatial orientation in fruit flies. Nat. Commun. 8:139.


Posner, M. I. (2016). Orienting of attention: then and now. Q. J. Exp. Psychol. 69, 1864–1875. doi: 10.1080/17470218.2014.937446


Netser, S., Zahar, Y., and Gutfreund, Y. (2011). Stimulus-specific adaptation: can it be a neural correlate of behavioral habituation? J. Neurosci. 31, 17811–17820. doi: 10.1523/jneurosci.4790-11.2011


Mysore, S. P., and Knudsen, E. I. (2012). Reciprocal inhibition of inhibition: a circuit motif for flexible categorization in stimulus selection. Neuron 73, 193–205. doi: 10.1016/j.neuron.2011.10.037


Winkowski, D. E., and Knudsen, E. I. (2008). Distinct mechanisms for top-down control of neural gain and sensitivity in the owl optic tectum. Neuron 60, 698–708. doi: 10.1016/j.neuron.2008.09.013


Smith, D. P. (2007). Odor and pheromone detection in Drosophila melanogaster. Pflug. Arch. Eur. J. Physiol. 454, 749–758. doi: 10.1007/s00424-006-0190-2


Schneider, J., and Levine, J. D. (2014). Automated identification of social interaction criteria in Drosophila melanogaster. Biol. Lett. 10:20140749. doi: 10.1098/rsbl.2014.0749


Zahar, Y., Lev-Ari, T., Wagner, H., and Gutfreund, Y. (2018). Behavioral Evidence and Neural Correlates of Perceptual Grouping by Motion in the Barn Owl. J. Neurosci. 38, 6653–6664. doi: 10.1523/jneurosci.0174-18.2018


Mysore, S. P., Asadollahi, A., and Knudsen, E. I. (2011). Signaling of the strongest stimulus in the owl optic tectum. J. Neurosci. 31, 5186–5196. doi: 10.1523/jneurosci.4592-10.2011


Payne, R. S. (1971). Acoustic location of prey by barn owls (Tyto alba). J. Exp. Biol. 54, 535–573.


Agarwal, A., and Gutfreund, Y. (2019). “Space and flight-direction representation in the dorsal pallium of barn owls,” in Neuroscience Meeting Planner, Chicago, Il: Society for Neuroscience.


Ma, K. Y., Chirarattananon, P., Fuller, S. B., and Wood, R. J. (2013). Controlled flight of a biologically inspired, insect-scale robot. Science 340, 603–607. doi: 10.1126/science.1231806


Karásek, M., Muijres, F. T., de Wagter, C., Remes, B. D., and de Croon, G. C. (2018). A tailless aerial robotic flapper reveals that flies use torque coupling in rapid banked turns. Science 361, 1089–1094. doi: 10.1126/science.aat0350


McGuire, K. N., de Wagter, C., Tuyls, K., Kappen, H. J., and de Croon, G. C. (2019). Minimal navigation solution for a swarm of tiny flying robots to explore an unknown environment. Sci. Rob. 4:eaaw9710. doi: 10.1126/scirobotics.aaw9710


Dupeyroux, J., Hagenaars, J., Paredes-Vallés, F., and de Croon, G. C. (2020). Neuromorphic control for optic-flow-based landings of MAVs using the Loihi processor. arXiv preprint arXiv:2011.00534. [preprint]


Brambilla, M., Ferrante, E., Birattari, M., and Dorigo, M. (2013). Swarm robotics: a review from the swarm engineering perspective. Swarm Intell. 7, 1–41. doi: 10.1007/s11721-012-0075-2


Anderson, M. J., Sullivan, J. G., Horiuchi, T. K., Fuller, S. B., and Daniel, T. L. (2020). A bio-hybrid odor-guided autonomous palm-sized air vehicle. Bioinspir. Biomimet. doi: 10.1088/1748-3190/abbd81 [Epub ahead of print].


Katsuki, F., and Constantinidis, C. (2014). Bottom-up and top-down attention: different processes and overlapping neural systems. Neuroscientist 20, 509–521. doi: 10.1177/1073858413514136


Fu, Q., Hu, C., Peng, J., Rind, F. C., and Yue, S. (2019). A robust collision perception visual neural network with specific selectivity to darker objects. IEEE Trans. Cybernet. 50, 5074–5088. doi: 10.1109/tcyb.2019.2946090


Hu, C., Xiong, C., Peng, J., and Yue, S. (2020). Coping With Multiple Visual Motion Cues Under Extremely Constrained Computation Power of Micro Autonomous Robots. IEEE Access 8, 159050–159066. doi: 10.1109/access.2020.3016893


Huang, J., Wei, Y., and Krapp, H. (2019). A biohybrid fly-robot interface system that performs active collision avoidance. Bioinspir. Biomimetics. 14:065001. doi: 10.1088/1748-3190/ab3b23


Kubilius, J., Schrimpf, M., Kar, K., Hong, H., Majaj, N. J., Rajalingham, R., et al. (2019). Brain-like object recognition with high-performing shallow recurrent ANNs. arXiv preprint arXiv:1909.06161 [preprint].


Tang, H., Schrimpf, M., Lotter, W., Moerman, C., Paredes, A., Caro, J. O., et al. (2018). Recurrent computations for visual pattern completion. Proc. Natl. Acad. Sci. U. S. A. 115, 8835–8840. doi: 10.1073/pnas.1719397115


Roffe, S., Akolkar, H., George, A. D., Linares-Barranco, B., and Benosman, R. (2021). Neutron-Induced, Single-Event Effects on Neuromorphic Event-based Vision Sensor: a First Step Towards Space Applications. arXiv preprint arXiv:2102.00112 [preprint].


Broadbent, D. E. (1957). Mechanical Model for Human Attention and Immediate Memory. Psychol. Rev. 64, 205–215. doi: 10.1037/h0047313


Keywords: neuromorphic sensing, autonomous flight, bio-inspiration, flying animals, learning, flight control, energy efficiency

Citation: Parlevliet PP, Kanaev A, Hung CP, Schweiger A, Gregory FD, Benosman R, de Croon GCHE, Gutfreund Y, Lo C-C and Moss CF (2021) Autonomous Flying With Neuromorphic Sensing. Front. Neurosci. 15:672161. doi: 10.3389/fnins.2021.672161

Received: 25 February 2021; Accepted: 07 April 2021;
Published: 14 May 2021.

Edited by:

Michael Schmuker, University of Hertfordshire, United Kingdom

Reviewed by:

Cheng Hu, Guangzhou University, China
Shizuko Hiryu, Doshisha University, Japan
Luca Patanè, University of Messina, Italy

Copyright © 2021 Parlevliet, Schweiger, Benosman, de Croon, Gutfreund, Lo and Moss. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.

*Correspondence: Patricia P. Parlevliet, patricia.parlevliet@airbus.com

Disclaimer: All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article or claim that may be made by its manufacturer is not guaranteed or endorsed by the publisher.