Event Abstract

Reducing duplication and redundancy in declarative model specifications

  • 1 Textensor Limited, United Kingdom
  • 2 University College London, United Kingdom
  • 3 Arizona State University, United States

Methods for storing and sharing biological models, whether as scripts or in declarative languages such as SBML or CellML, tend to focus on directly encoding a model and its equations. However, many models share essentially the same equations, differing only in parameter values or in the number of instances of particular processes. We have therefore developed a mechanism within NeuroML [1] whereby the common structural and mathematical features of a family of models can be expressed separately from the parameter values that define a particular member of the family. Specifications in this form correspond closely to the conceptual understanding that modelers have of the systems they work on. Supporting this style of description allows models to be expressed more concisely than with flatter structures, since it avoids repeating common elements. It also enables a wide range of model-related processing beyond simply executing models, including systematic comparison, archiving, model composition, and sensitivity analysis. The resulting system, known as LEMS (Low Entropy Model Specification), has been developed to meet the needs of NeuroML but can also be applied to other declarative model specifications. It makes it possible to retrofit most of the existing high-level concepts in NeuroML version 1 with domain-independent definitions built out of LEMS elements. This preserves the benefits of the existing semantically rich high-level concepts while adding the ability to extend the language with new concepts at the user level, rather than requiring changes to the language itself. Simulator developers have a choice between directly supporting the library of core types in NeuroML or supporting the underlying LEMS definitions. Libraries are available in both Java and Python to allow simulator developers to work with LEMS models without having to implement the low-level support themselves. These include options for flattening models into sets of simultaneous differential equations, and for code generation to create standalone executable units.

References

1. Gleeson P, Crook S, Cannon RC, Hines ML, Billings GO, et al. NeuroML: A Language for Describing Data Driven Models of Neurons and Networks with a High Degree of Biological Detail. PLoS Comput Biol 6(6): e1000815, 2010. doi:10.1371/journal.pcbi.1000815
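The central idea of the abstract, expressing a family of models as a shared type definition plus per-member parameter values, and flattening a member into plain differential equations, can be sketched in a few lines of Python. This is a minimal illustration only, not the actual LEMS or NeuroML API; all class and function names here (`ComponentType`, `Component`, `flatten`, `euler`) are hypothetical.

```python
# Hypothetical sketch of the definition/parameter separation described in
# the abstract. Not the real LEMS/NeuroML API.
from dataclasses import dataclass

@dataclass
class ComponentType:
    """Shared structure of a model family: states and rate expressions."""
    name: str
    parameters: list   # parameter names the type exposes
    state: dict        # state variable -> initial value
    rates: dict        # state variable -> rate expression (string)

@dataclass
class Component:
    """One member of the family: a type plus concrete parameter values."""
    ctype: ComponentType
    values: dict

def flatten(component):
    """Turn a component into a plain dx/dt function (simultaneous ODEs)."""
    names = list(component.ctype.state)
    def dxdt(state_values):
        env = dict(component.values)
        env.update(zip(names, state_values))
        return [eval(component.ctype.rates[n], {}, env) for n in names]
    return names, dxdt

def euler(component, dt=0.01, steps=1000):
    """Integrate a flattened component with forward Euler."""
    names, dxdt = flatten(component)
    x = [component.ctype.state[n] for n in names]
    for _ in range(steps):
        x = [xi + dt * di for xi, di in zip(x, dxdt(x))]
    return dict(zip(names, x))

# The family's equations are defined once...
leaky = ComponentType(
    name="leakyIntegrator",
    parameters=["tau", "target"],
    state={"v": 0.0},
    rates={"v": "(target - v) / tau"},
)

# ...and two members differ only in their parameter values.
fast = Component(leaky, {"tau": 1.0, "target": 10.0})
slow = Component(leaky, {"tau": 5.0, "target": 10.0})

print(euler(fast))  # v close to the target
print(euler(slow))  # v still approaching the target
```

Because the two components share one `ComponentType`, nothing about the equations is duplicated; tools can compare, compose, or generate code for any member of the family by working over the single shared definition.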

Keywords: General neuroinformatics

Conference: 5th INCF Congress of Neuroinformatics, Munich, Germany, 10 Sep - 12 Sep, 2012.

Presentation Type: Poster

Topic: Neuroinformatics

Citation: Cannon R, Gleeson P, Crook S and Silver A (2013). Reducing duplication and redundancy in declarative model specifications. Front. Neuroinform. Conference Abstract: 5th INCF Congress of Neuroinformatics. doi: 10.3389/conf.fninf.2013.08.00008

Copyright: The abstracts in this collection have not been subject to any Frontiers peer review or checks, and are not endorsed by Frontiers. They are made available through the Frontiers publishing platform as a service to conference organizers and presenters.

The copyright in the individual abstracts is owned by the author of each abstract or his/her employer unless otherwise stated.

Each abstract, as well as the collection of abstracts, are published under a Creative Commons CC-BY 4.0 (attribution) licence (https://creativecommons.org/licenses/by/4.0/) and may thus be reproduced, translated, adapted and be the subject of derivative works provided the authors and Frontiers are attributed.

For Frontiers’ terms and conditions please see https://www.frontiersin.org/legal/terms-and-conditions.

Received: 21 Mar 2013; Published Online: 27 Nov 2013.

* Correspondence: Dr. Robert Cannon, Textensor Limited, Edinburgh, UK, robert.c.cannon@gmail.com