CORRECTION article

Front. Appl. Math. Stat.

Sec. Optimization

Volume 11 - 2025 | doi: 10.3389/fams.2025.1629658

This article is part of the Research Topic "Large Tensor Analysis and Applications".

Corrigendum: Expectation-maximization alternating least squares for tensor network logistic regression

Provisionally accepted
  • 1Nagoya Institute of Technology, Nagoya, Japan
  • 2RIKEN Center for Advanced Intelligence Project, Tokyo, Japan

The final, formatted version of the article will be published soon.

There were errors in the formulas in the paper. We had originally derived the correct equations (shown in the section "Derivation of Corrections" below) and had implemented the computer program correctly. However, mistakes occurred during the writing stage, and we did not notice them until now. The error was caused by a simple typo. We confirmed that only Equations (58), (59), (73), and (74) in the paper were incorrect, and that they were correctly implemented in the computer programs used for the experiments. Therefore, this revision does not change any of the experimental results or the conclusions of the original paper.

In the published article, there was an error: some formulas contained typos. Corrections have been made to Equations (58), (59), (73), and (74) in Section 4. These previously stated:

\hat{\beta}_m = (Z_m^\top \Omega_t Z_m)^{-1} Z_m^\top (\kappa \oslash \omega_t),  (58)

\hat{\beta}_m^{(\epsilon)} = (Z_m^\top \Omega_t Z_m + \epsilon I)^{-1} Z_m^\top (\kappa \oslash \omega_t),  (59)

\hat{\beta}_m = (Z_m^\top \Omega_t Z_m)^{-1} Z_m^\top (\kappa \oslash \omega_t),  (73)

\hat{\beta}_m^{(\epsilon)} = (Z_m^\top \Omega_t Z_m + \epsilon I)^{-1} Z_m^\top (\kappa \oslash \omega_t).  (74)

The corrected equations read:

\hat{\beta}_m = (Z_m^\top \Omega_t Z_m)^{-1} Z_m^\top \kappa,  (58)

\hat{\beta}_m^{(\epsilon)} = (Z_m^\top \Omega_t Z_m + \epsilon I)^{-1} Z_m^\top \kappa,  (59)

\hat{\beta}_m = (Z_m^\top \Omega_t Z_m)^{-1} Z_m^\top \kappa,  (73)

\hat{\beta}_m^{(\epsilon)} = (Z_m^\top \Omega_t Z_m + \epsilon I)^{-1} Z_m^\top \kappa.  (74)

The authors apologize for these errors and state that they do not change the scientific conclusions of the article in any way. The original article has been updated.

Derivation of Corrections

Here we show only the derivation of Equation (58); the remaining three corrections can be derived in a similar manner. First, Equation (58) is the weighted least squares solution for

g(A_m, v \mid \theta^{(t)}) = \| \Omega_t^{1/2} (\kappa \oslash \omega_t) - \Omega_t^{1/2} (L^{(m)} \odot \Phi^{(m)} \odot R^{(m)})^\top \mathrm{vec}(A_m) + v \Omega_t^{1/2} \mathbf{1} \|_2^2,  (57)

where we put \Omega_t = \mathrm{diag}(\omega_1^{(t)}, \omega_2^{(t)}, \ldots, \omega_N^{(t)}) \in \mathbb{R}^{N \times N}, \omega_t = [\omega_1^{(t)}, \omega_2^{(t)}, \ldots, \omega_N^{(t)}]^\top \in \mathbb{R}^N, and \kappa = [\kappa_1, \kappa_2, \ldots, \kappa_N]^\top \in \mathbb{R}^N. With Z_m = [(L^{(m)} \odot \Phi^{(m)} \odot R^{(m)})^\top, \mathbf{1}], the cost function is rewritten as

\| \Omega_t^{1/2} (\kappa \oslash \omega_t) - \Omega_t^{1/2} (L^{(m)} \odot \Phi^{(m)} \odot R^{(m)})^\top \mathrm{vec}(A_m) + v \Omega_t^{1/2} \mathbf{1} \|_2^2
= \| \Omega_t^{1/2} (\kappa \oslash \omega_t) - \Omega_t^{1/2} Z_m \beta_m \|_2^2
= \| \Omega_t^{1/2} (Z_m \beta_m - \kappa \oslash \omega_t) \|_2^2
= \| \Omega_t^{1/2} (Z_m \beta_m - \Omega_t^{-1} \kappa) \|_2^2
= (Z_m \beta_m - \Omega_t^{-1} \kappa)^\top \Omega_t (Z_m \beta_m - \Omega_t^{-1} \kappa)
= \beta_m^\top Z_m^\top \Omega_t Z_m \beta_m - 2 \beta_m^\top Z_m^\top \kappa + \kappa^\top \Omega_t^{-1} \kappa.

Note that \kappa \oslash \omega_t = \Omega_t^{-1} \kappa. The optimality condition is given by

\frac{\partial g}{\partial \beta_m} \Big|_{\beta_m = \hat{\beta}_m} = 2 Z_m^\top \Omega_t Z_m \hat{\beta}_m - 2 Z_m^\top \kappa = 0,  (1)

which yields \hat{\beta}_m = (Z_m^\top \Omega_t Z_m)^{-1} Z_m^\top \kappa, i.e., the corrected Equation (58).
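The correction can also be checked numerically. The following is a minimal NumPy sketch (not part of the original article; Z, kappa, and omega are randomly generated stand-ins for Z_m, \kappa, and \omega_t): the corrected closed form (Z_m^\top \Omega_t Z_m)^{-1} Z_m^\top \kappa matches the minimizer of the weighted least squares cost \| \Omega_t^{1/2} (\kappa \oslash \omega_t) - \Omega_t^{1/2} Z_m \beta_m \|_2^2 obtained by ordinary least squares on the whitened system, while the previously printed form with \kappa \oslash \omega_t does not.

```python
import numpy as np

rng = np.random.default_rng(0)
N, P = 50, 4                       # N samples, P columns in Z_m (illustrative sizes)

Z = rng.standard_normal((N, P))    # stand-in for Z_m
kappa = rng.standard_normal(N)     # stand-in for kappa
omega = rng.uniform(0.5, 2.0, N)   # positive weights, stand-in for omega_t
Omega = np.diag(omega)             # Omega_t = diag(omega_t)

# Corrected Equation (58): beta = (Z^T Omega Z)^{-1} Z^T kappa
beta_corrected = np.linalg.solve(Z.T @ Omega @ Z, Z.T @ kappa)

# Direct minimization of || Omega^{1/2}(kappa ./ omega) - Omega^{1/2} Z beta ||^2
# via ordinary least squares on the whitened system sqrt(omega) * (...).
sqrt_w = np.sqrt(omega)
beta_lstsq, *_ = np.linalg.lstsq(sqrt_w[:, None] * Z,
                                 sqrt_w * (kappa / omega), rcond=None)

# The corrected closed form agrees with the least squares minimizer ...
assert np.allclose(beta_corrected, beta_lstsq)

# ... while the previously printed (incorrect) form with kappa ./ omega does not.
beta_typo = np.linalg.solve(Z.T @ Omega @ Z, Z.T @ (kappa / omega))
assert not np.allclose(beta_typo, beta_lstsq)
```

The two forms coincide only when all weights \omega_n^{(t)} are equal, which is why the typo is invisible in that degenerate case but wrong in general.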

Keywords: expectation-maximization (EM), majorization-minimization (MM), alternating least squares (ALS), tensor networks, tensor train, logistic regression, Polya-gamma augmentation

Received: 16 May 2025; Accepted: 03 Jun 2025.

Copyright: © 2025 Yamauchi, Hontani and Yokota. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) or licensor are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.

* Correspondence: Tatsuya Yokota, Nagoya Institute of Technology, Nagoya, Japan

Disclaimer: All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article or claim that may be made by its manufacturer is not guaranteed or endorsed by the publisher.