Research Topic

Deep Learning in Neuromorphic Systems

About this Research Topic

Deep learning was listed among the MIT Technology Review's top 10 technology breakthroughs of 2013, and it is now an active research area in academic, industrial, and government labs. Deep neural networks have achieved state-of-the-art machine learning results, approaching human-level performance on difficult machine perception tasks such as speech recognition, language understanding, and visual perception. Given the prominence of this topic and its overlap with many parts of neuromorphic engineering, it is ideally suited and timely for a Research Topic that highlights the technical strengths of the neuromorphic community. We will focus on hardware and neuromorphic accelerators, which specifically leverage our strengths and make this Research Topic highly relevant for Frontiers in Neuromorphic Engineering.

The Research Topic will focus on various implementation aspects of deep learning architectures. One particular interest is efficient on-line learning algorithms implemented in hardware. State-of-the-art deep convolutional neural networks achieve remarkable performance by using the back-propagation algorithm to fine-tune the weights of the connections between layers. Storing the billions of meticulously tuned weights used by such networks is the main bottleneck for hardware implementations, and, more importantly, training these networks is very time-consuming. Efficient on-line learning algorithms are therefore vital for deploying deep learning on dedicated hardware, and papers on this aspect will be prioritised. Furthermore, the most widely used deep learning models are convolutional neural networks, deep belief networks, and deep networks with stacked auto-encoders; we also invite papers that realise such algorithms in custom hardware using state-of-the-art IC design techniques, with power, speed, and complexity optimised for these applications. Lastly, since typical deep learning architectures are inspired by biology without being strictly neurophysiologically plausible, we also encourage papers that attempt to close the loop between biology, algorithms, and hardware.
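The back-propagation updates mentioned above reduce, at each layer, to a gradient-descent step on the connection weights, which is what a hardware on-line learning rule must ultimately implement. A minimal sketch of that step for a single linear layer with squared-error loss (the names, data, and learning rate here are illustrative assumptions, not any specific hardware design):

```python
import numpy as np

# Tiny synthetic regression problem: targets generated by a known weight
# vector, so gradient descent should recover it.
rng = np.random.default_rng(0)
x = rng.normal(size=(8, 3))           # 8 samples, 3 inputs
w_true = np.array([1.0, -2.0, 0.5])   # "ground-truth" connection weights
y = x @ w_true                        # targets

w = np.zeros(3)                       # weights to be learned
lr = 0.1                              # learning rate (illustrative choice)

def loss(w):
    err = x @ w - y
    return float(np.mean(err ** 2))   # mean squared error

before = loss(w)
for _ in range(200):
    err = x @ w - y
    grad = 2.0 * x.T @ err / len(y)   # dL/dw for the squared-error loss
    w -= lr * grad                    # the gradient-descent weight update
after = loss(w)
```

In a full deep network, back-propagation supplies this same `grad` term layer by layer via the chain rule; the storage of `w` for every connection, and the many repetitions of this update loop, are exactly the memory and training-time bottlenecks the paragraph above describes.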

Relevant topics include:
• Non-spiking hardware implementations of deep learning (analogue, digital, mixed-signal implementations);
• Spiking hardware implementations of deep learning (analogue, digital, mixed-signal implementations);
• Neuromorphic sensors combined with deep learning neural networks;
• Hardware models of biologically inspired learning, such as learning with cortical circuits.

The resulting collection of original research articles, reviews, and commentaries will be a reference for deep learning in neuromorphic systems, fostering research progress through discussion and new collaborations among researchers in our community.


Important Note: All contributions to this Research Topic must be within the scope of the section and journal to which they are submitted, as defined in their mission statements. Frontiers reserves the right to guide an out-of-scope manuscript to a more suitable section or journal at any stage of peer review.



Submission Deadlines

31 January 2018 Abstract
31 May 2018 Manuscript

Participating Journals

Manuscripts can be submitted to this Research Topic via the following journals:

