ORIGINAL RESEARCH article

Front. Big Data
Sec. Machine Learning and Artificial Intelligence
Volume 7 - 2024 | doi: 10.3389/fdata.2024.1382144

Efficient Enhancement of Low-Rank Tensor Completion via Thin QR Decomposition (Provisionally Accepted)

  • Yunnan Key Laboratory of Statistical Modeling and Data Analysis, Yunnan University, China

The final, formatted version of the article will be published soon.

Low-rank tensor completion (LRTC), which aims to recover the missing entries of a partially observed tensor by exploiting its low-rank structure, has been widely applied to real-world problems. The core tensor nuclear norm minimization (CTNM) method based on Tucker decomposition is one of the most common LRTC methods. However, Tucker-based CTNM methods often incur a high computational cost because the standard factor-matrix update requires multiple singular value decompositions (SVDs) in every iteration. To address this problem, this article proposes an efficient CTNM method based on thin QR decomposition (CTNM-QR) with lower computational complexity. The proposed method extends CTNM by introducing tensor-valued auxiliary variables instead of matrices, and it solves for the factor matrices with thin QR decompositions rather than SVDs, which reduces the computational cost and improves completion accuracy. The convergence and complexity of CTNM-QR are also analyzed. Extensive experiments on synthetic data, real color images, and brain MRI data at different missing rates demonstrate that the proposed method not only achieves better completion accuracy and visual quality, but also runs more efficiently than most state-of-the-art LRTC methods.
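
The key computational idea, replacing the per-iteration SVDs of the factor-matrix update with thin QR factorizations, can be illustrated with a short NumPy sketch. The snippet below is not the authors' CTNM-QR algorithm; it only contrasts the two ways of obtaining an orthonormal basis for a tall matrix of the kind that arises when a Tucker factor is updated, with matrix sizes and the target rank chosen purely for illustration.

```python
import numpy as np

# Illustrative size and rank (not taken from the paper).
n, r = 2000, 10
rng = np.random.default_rng(0)

# M stands in for the tall matrix whose orthonormal range is needed when one
# Tucker factor is updated (e.g. a mode-n unfolding multiplied by the other
# factors); it has full column rank r with probability one.
M = rng.standard_normal((n, r)) @ rng.standard_normal((r, r))

# SVD route: the left singular vectors give an orthonormal basis, but the
# decomposition also computes singular values and right vectors that are
# never used when only the basis is required.
U, _, _ = np.linalg.svd(M, full_matrices=False)

# Thin QR route: the reduced QR factor Q spans the same column space whenever
# M has full column rank, and is cheaper to obtain than the SVD.
Q, _ = np.linalg.qr(M, mode='reduced')

# The two bases differ only by an r x r orthogonal rotation, so projecting Q
# onto range(U) returns Q up to floating-point rounding.
print(np.linalg.norm(U @ (U.T @ Q) - Q))  # ~0 up to rounding error
```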

Keywords: Auxiliary variable tensor, Tensor nuclear norm minimization, Thin QR decomposition, Tucker decomposition, Tucker rank

Received: 05 Feb 2024; Accepted: 30 Apr 2024.

Copyright: © 2024 Wu and Jin. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) or licensor are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.

* Correspondence: Dr. Yunzhi Jin, Yunnan Key Laboratory of Statistical Modeling and Data Analysis, Yunnan University, Kunming, Yunnan Province, China