
Frontiers in Science Lead Article
Published on 16 Dec 2025
Breaking the memory wall: next-generation artificial intelligence hardware

Join Prof Kaushik Roy (Purdue University, USA) and colleagues for a complimentary virtual symposium on next steps for neuromorphic computing.

Prof R. Stanley Williams, Texas A&M University, USA: AI's unsustainable energy and infrastructure requirements demand unified, brain-inspired architectures that combine memory and computation to eliminate costly data-movement bottlenecks that software-level advances alone cannot resolve.

Dr Vilas Dhar, Patrick J. McGovern Foundation, USA: During this brief technological transition, policymakers must embed equitable access and accountable governance into AI, advancing the global public good rather than entrenching current political and economic inequalities.
Efficient artificial intelligence (AI) hardware is crucial for resource-constrained applications such as healthcare and transportation, where it enhances performance, reduces costs, and supports real-time decision-making.
Overcoming the memory wall of traditional hardware is critical to improving the computational efficiency of AI, reducing latency, and processing complex algorithms more effectively.
Compute-in-memory (CIM) paradigms using different memory technologies, such as embedded non-volatile memory (eNVM), static random-access memory (SRAM), dynamic random-access memory (DRAM), and flash memory, help build energy-efficient AI hardware by tackling the memory wall problem (a minimal crossbar sketch follows this list).
Stochasticity in AI algorithms (e.g., via spike timing-dependent plasticity, or STDP) and hardware (e.g., via spin–orbit torque magnetic tunnel junctions, or SOT-MTJs) can be leveraged to improve energy efficiency across diverse workloads and could unlock novel capabilities (see the stochastic STDP sketch after this list).
Co-designing hardware and algorithms to optimize energy, latency, and accuracy will lead to a “converged platform” for artificial neural networks (ANNs) and spiking neural networks (SNNs), suitable for diverse AI applications (a converged-execution sketch closes this list).
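
To make the compute-in-memory point concrete, here is a minimal sketch, assuming idealized devices, of an analog crossbar performing a matrix-vector multiply: weights are stored as programmable conductances, inputs are applied as voltages, and per Ohm's and Kirchhoff's laws each column current sums the products in place. The conductance range, bit depth, and noise level below are hypothetical, chosen only for illustration.

```python
import numpy as np

# Minimal, idealized model of an analog compute-in-memory crossbar.
# Weights live in the array as conductances; the multiply-accumulate
# happens in place, so no weight data moves across a memory bus.

rng = np.random.default_rng(0)

# Hypothetical 4-bit conductance range for an eNVM cell (illustrative values).
G_MIN, G_MAX = 1e-6, 16e-6   # siemens
LEVELS = 16                  # 4-bit programmable conductance levels

def program_weights(w):
    """Quantize a weight matrix in [0, 1] onto discrete conductance levels."""
    codes = np.clip(np.round(w * (LEVELS - 1)), 0, LEVELS - 1)
    return G_MIN + codes * (G_MAX - G_MIN) / (LEVELS - 1)

def crossbar_mvm(G, v_in, sigma_read=0.02):
    """Ohm's law per cell, Kirchhoff summation per column (bit line).
    sigma_read models multiplicative device read noise."""
    noisy_G = G * (1 + sigma_read * rng.standard_normal(G.shape))
    return v_in @ noisy_G   # column currents, in amperes

w = rng.uniform(0, 1, size=(8, 4))   # 8 inputs x 4 outputs
v = rng.uniform(0, 0.2, size=8)      # input voltages (volts)
i_out = crossbar_mvm(program_weights(w), v)
print("column currents (A):", i_out)
```

Because the multiply-accumulate happens where the weights reside, the weight matrix never crosses a memory bus, which is precisely the data-movement cost the memory wall refers to.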
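
The stochastic learning point can likewise be sketched in a few lines. The toy rule below gates the probability of a binary synapse switching by the pre/post spike-timing difference, using an exponential STDP window and loosely mimicking the probabilistic switching of an SOT-MTJ under a short current pulse; the time constant and peak probability are hypothetical values, not device measurements.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy stochastic STDP rule: the *probability* of a binary synapse
# switching is set by the pre/post spike-timing difference, loosely
# mimicking the probabilistic switching of an SOT-MTJ under a pulse.

TAU = 20.0      # ms, STDP time constant (hypothetical)
P_MAX = 0.8     # peak switching probability (hypothetical)

def switch_probability(dt):
    """Exponential STDP window: dt = t_post - t_pre (ms)."""
    return P_MAX * np.exp(-abs(dt) / TAU)

def stdp_update(w, dt):
    """Potentiate (w -> 1) if pre precedes post, depress (w -> 0)
    otherwise, each with a timing-dependent probability."""
    if rng.random() < switch_probability(dt):
        return 1 if dt > 0 else 0
    return w

# One synapse, repeated pairings at dt = +5 ms: potentiation becomes likely.
w = 0
for _ in range(10):
    w = stdp_update(w, dt=5.0)
print("binary weight after 10 pairings:", w)
```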
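
Finally, one way to read the “converged platform” point is that a single stored weight array can serve both execution styles. The sketch below, a simplification under rate-coding assumptions, runs the same weights through a ReLU ANN layer and through integrate-and-fire neurons driven by Poisson spike trains, whose firing rates roughly approximate the ANN activations.

```python
import numpy as np

rng = np.random.default_rng(2)

# One weight matrix serving two execution modes, as a stand-in for a
# "converged platform": an ANN forward pass and a rate-coded SNN pass
# over the same stored weights.

W = rng.uniform(-0.5, 0.5, size=(8, 4))
x = rng.uniform(0, 1, size=8)

def ann_layer(x, W):
    return np.maximum(0, x @ W)               # ReLU activation

def snn_layer(x, W, T=200, v_th=1.0):
    """Integrate-and-fire neurons driven by Poisson spike trains whose
    rates encode x; output firing rates approximate the ANN activations."""
    v = np.zeros(W.shape[1])
    counts = np.zeros(W.shape[1])
    for _ in range(T):
        spikes = (rng.random(x.shape) < x).astype(float)  # rate coding
        v += spikes @ W
        fired = v >= v_th
        counts += fired
        v[fired] -= v_th                       # soft reset
    return counts / T                          # firing rates

print("ANN:", ann_layer(x, W))
print("SNN rate estimate:", snn_layer(x, W))
```

In hardware terms, this is the appeal of co-design: the same CIM array could serve ANN inference where accuracy dominates and SNN inference where sparsity and event-driven energy savings matter most.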

A summary of the lead article in a Q&A format, with a figure and video.

A version of the lead article written for—and peer reviewed by—kids aged 8-15 years.
Follow the science, follow Frontiers in Science