AUTHOR=Zou Yangkun, Wu Jiande, Ye Bo, Cao Honggui, Feng Jiqi, Wan Zijie, Yin Shaoda TITLE=Infrared and visible image fusion based on multi-scale transform and sparse low-rank representation JOURNAL=Frontiers in Physics VOLUME=Volume 13 - 2025 YEAR=2025 URL=https://www.frontiersin.org/journals/physics/articles/10.3389/fphy.2025.1514476 DOI=10.3389/fphy.2025.1514476 ISSN=2296-424X ABSTRACT=Infrared and visible image sensors are widely used and exhibit strong complementary properties, so fusing infrared and visible images can serve a wider range of applications. To improve the fusion of infrared and visible images, this paper proposes a novel and effective fusion method based on multi-scale transform and sparse low-rank representation. Visible and infrared images are first decomposed into low-pass and high-pass bands by a Laplacian pyramid (LP). Second, the low-pass bands are represented by sparse and low-rank coefficients. To improve computational efficiency and learn a universal dictionary, the low-pass bands are divided into image patches with a sliding window before the sparse and low-rank representation. The low-pass and high-pass bands are then fused by dedicated fusion rules: the max-absolute rule fuses the high-pass bands, and the max-L1-norm rule fuses the low-pass bands. Finally, an inverse LP is performed to obtain the fused image. We conduct experiments on three datasets and use 13 metrics to validate our method thoroughly and impartially. The results demonstrate that the proposed fusion framework effectively preserves the characteristics of the source images and exhibits superior stability across various image pairs and metrics.
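The pipeline described in the abstract can be sketched in simplified form. This is a minimal illustration, not the authors' implementation: it omits the learned dictionary and the sparse low-rank coding of low-pass patches, applying the max-L1 rule patch-wise to the low-pass pixels directly as a stand-in, and it uses an undecimated (full-resolution) pyramid so that reconstruction is a plain sum of bands. All function names and parameters here are assumptions for illustration.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def lp_decompose(img, levels=3, sigma=2.0):
    """Laplacian-pyramid-style decomposition into high-pass bands plus a
    low-pass residual. Kept at full resolution (no downsampling) so the
    inverse transform is just the sum of all bands."""
    highs = []
    cur = img.astype(float)
    for _ in range(levels):
        low = gaussian_filter(cur, sigma)
        highs.append(cur - low)  # high-pass band at this scale
        cur = low
    return highs, cur            # cur is the final low-pass band

def fuse_high(a, b):
    """Max-absolute rule: keep the coefficient with larger magnitude."""
    return np.where(np.abs(a) >= np.abs(b), a, b)

def fuse_low(a, b, patch=8):
    """Patch-wise max-L1 rule, applied to low-pass pixels as a simplified
    stand-in for the paper's max-L1 rule on sparse low-rank codes."""
    out = a.copy()
    h, w = a.shape
    for i in range(0, h, patch):
        for j in range(0, w, patch):
            pa = a[i:i + patch, j:j + patch]
            pb = b[i:i + patch, j:j + patch]
            if np.abs(pb).sum() > np.abs(pa).sum():
                out[i:i + patch, j:j + patch] = pb
    return out

def fuse(ir, vis, levels=3):
    """Decompose both images, fuse bands per rule, invert the transform."""
    h_ir, l_ir = lp_decompose(ir, levels)
    h_vis, l_vis = lp_decompose(vis, levels)
    fused = fuse_low(l_ir, l_vis)
    for a, b in zip(h_ir, h_vis):
        fused = fused + fuse_high(a, b)  # inverse: add fused bands back
    return fused
```

A real LP downsamples between levels and reconstructs with upsampling; the undecimated variant above trades memory for a trivial inverse, which keeps the fusion rules themselves easy to see.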