Event Abstract

A Model of Visual Route Navigation in Ants Without Waypoints

  • 1 University of Sussex, Informatics, United Kingdom
  • 2 University of Sussex, Life Sciences, United Kingdom

The impressive ability of social insects to learn long foraging routes guided by visual information provides proof that robust spatial behaviour can be produced with limited neural resources. As such, social insects have become an important model system for understanding the minimal cognitive requirements for navigation. Models of visual navigation that successfully replicate place homing are dominated by snapshot-type models, in which a single view of the world, memorized from a discrete location or waypoint (usually the goal location), is compared to the current view in order to drive a search for the goal. However, snapshot approaches only allow navigation in the immediate vicinity of the goal; to navigate longer distances, the usual approach is to divide the route into multiple discrete waypoints and use local navigation to travel between each in turn. This approach has its own problems, however, and robust navigation requires an additional place recognition mechanism.

Here we present a parsimonious model of visually guided route learning that achieves robust long-range route navigation without dividing the world into discrete waypoints (Baddeley et al., 2012). We test the proposed route navigation strategy in simulation by learning a series of routes through visually cluttered environments consisting of objects that are distinguishable only as silhouettes against the sky. Our navigation algorithm consists of two phases. The ant first traverses the route using a combination of path integration and obstacle avoidance, during which it experiences the views used to learn the route. Subsequently, the ant navigates by visually scanning the world – a behaviour observed in ants in the field – and moving in the direction deemed most familiar. As a proof of concept, we first determine view familiarity by exhaustive comparison with the set of views experienced during training. In subsequent experiments we train an artificial neural network to perform familiarity discrimination on the training views using the InfoMax algorithm.
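The scan-and-move loop for the exhaustive-comparison (proof-of-concept) variant can be sketched as follows. This is a minimal illustration, not the published implementation: the `render_view` function, the scan parameters, and the use of a sum-of-squared-differences familiarity measure are all assumptions made for the sake of the sketch.

```python
import numpy as np

def view_familiarity(view, training_views):
    """Familiarity of the current view, taken as the negative of the
    smallest sum-of-squared-differences (SSD) to any stored training view.
    Higher (closer to zero) means more familiar."""
    diffs = ((training_views - view) ** 2).sum(axis=1)
    return -diffs.min()

def scan_and_step(position, heading, training_views, render_view,
                  scan_range=np.deg2rad(120), n_samples=25, step=0.1):
    """One cycle of visual scanning: sample candidate headings around the
    current heading, score the view in each direction for familiarity,
    then step in the most familiar direction.

    `render_view(position, heading)` is a hypothetical helper that returns
    the (flattened) panoramic view from a pose in the simulated world."""
    candidates = heading + np.linspace(-scan_range / 2, scan_range / 2,
                                       n_samples)
    scores = [view_familiarity(render_view(position, h), training_views)
              for h in candidates]
    best = candidates[int(np.argmax(scores))]
    new_position = position + step * np.array([np.cos(best), np.sin(best)])
    return new_position, best
```

Because familiarity is evaluated over whole views rather than against a single goal snapshot, the same loop serves for both route following and place search; replacing `view_familiarity` with the output of a trained familiarity network (e.g. one trained with InfoMax) leaves the control loop unchanged.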

By exploiting the interaction of sensorimotor constraints and observed innate behaviours, we show that robust behaviour can be produced using a learnt holistic representation of a route. Furthermore, we show that the model captures the known properties of route navigation in desert ants, including the ability to learn a route after a single training run and the ability to learn multiple idiosyncratic routes to a single goal. Importantly, navigation is independent of odometric or compass information, does not specify when or what to learn, and does not separate routes into sequences of waypoints, thus providing proof of concept that route navigation can be achieved without these elements. The algorithm also exhibits both place search and route navigation with the same mechanism. As such, we believe the model represents the only detailed and complete model of insect route guidance to date.

Acknowledgements

This work was supported by EPSRC grant EP/I031758/1.

References

Baddeley B, Graham P, Husbands P, Philippides A (2012). A Model of Ant Route Navigation Driven by Scene Familiarity. PLoS Comput Biol 8(1): e1002336.

Keywords: ant, insect navigation, modelling, Visual homing

Conference: Tenth International Congress of Neuroethology, College Park, Maryland, United States, 5 Aug - 10 Aug, 2012.

Presentation Type: Poster (but consider for Participant Symposium)

Topic: Orientation and Navigation

Citation: Baddeley B, Graham P, Husbands P and Philippides A (2012). A Model of Visual Route Navigation in Ants Without Waypoints. Conference Abstract: Tenth International Congress of Neuroethology. doi: 10.3389/conf.fnbeh.2012.27.00261

Copyright: The abstracts in this collection have not been subject to any Frontiers peer review or checks, and are not endorsed by Frontiers. They are made available through the Frontiers publishing platform as a service to conference organizers and presenters.

The copyright in the individual abstracts is owned by the author of each abstract or his/her employer unless otherwise stated.

Each abstract, as well as the collection of abstracts, are published under a Creative Commons CC-BY 4.0 (attribution) licence (https://creativecommons.org/licenses/by/4.0/) and may thus be reproduced, translated, adapted and be the subject of derivative works provided the authors and Frontiers are attributed.

For Frontiers’ terms and conditions please see https://www.frontiersin.org/legal/terms-and-conditions.

Received: 30 Apr 2012; Published Online: 07 Jul 2012.

* Correspondence: Dr. Andrew Philippides, University of Sussex, Informatics, Brighton, Sussex, BN1 9QJ, United Kingdom, andrewop@sussex.ac.uk