AUTHOR=Alam Mahabubul, Ghosh Swaroop
TITLE=QNet: A Scalable and Noise-Resilient Quantum Neural Network Architecture for Noisy Intermediate-Scale Quantum Computers
JOURNAL=Frontiers in Physics
VOLUME=9
YEAR=2022
URL=https://www.frontiersin.org/journals/physics/articles/10.3389/fphy.2021.755139
DOI=10.3389/fphy.2021.755139
ISSN=2296-424X
ABSTRACT=Quantum machine learning (QML) is promising for potential speedups and improvements in conventional machine learning (ML) tasks (e.g., classification, regression, etc.). Small-scale quantum computers dominate the NISQ era. They are built using various technologies such as superconducting qubits, ion traps, neutral atoms, etc. Existing QML models leverage these small quantum computers to solve large-scale ML problems using two major approaches: (i) reducing the data dimension with a classical algorithm, e.g., Principal Component Analysis (PCA), before applying a QML model, and (ii) repeatedly uploading the high-dimensional classical data into a small number of qubits using sequential rotation operations. Both approaches suffer from a large accumulation of gate errors and decoherence. In this article, we present a third approach (QNet) to leverage small and heterogeneous quantum computing systems for QML applications. QNet does not require dimension reduction or repeated data uploading. It consists of several small quantum neural networks (QNNs) that can be executed on small quantum computers. QNet has three major configurable attributes: the number of qubits per QNN, the number of trainable parameters, and the total number of QNNs in the network. By carefully choosing these attributes, QNet can exploit quantum computers of arbitrary size to solve supervised ML tasks of any scale. It also enables heterogeneous technology integration in a single QML application. Through empirical studies, we show the trainability and generalization of the proposed approach and the impact of various configurable variables on its performance.
We compare QNet's performance against existing models and discuss potential issues and design considerations. In our study, trained QNet models showed 43% better accuracy on average over the existing models on hardware emulators. More importantly, QNet provides a blueprint for building noise-resilient QML models from a collection of small quantum neural networks on near-term noisy quantum devices.