TY - JOUR
AU - Kliesch, Martin
AU - Szarek, Stanislaw J.
AU - Jung, Peter
PY - 2019
DO  - 10.3389/fams.2019.00023
SP - 23
TI - Simultaneous Structures in Convex Signal Recovery—Revisiting the Convex Combination of Norms
JO - Frontiers in Applied Mathematics and Statistics
UR - https://www.frontiersin.org/article/10.3389/fams.2019.00023
VL - 5
SN - 2297-4687
N2  - In compressed sensing, one uses known structures of otherwise unknown signals to recover them from as few linear observations as possible. The structure comes in the form of some compressibility, including different notions of sparsity and low-rankness. In many cases, convex relaxations make it possible to efficiently solve the inverse problems using standard convex solvers at almost-optimal sampling rates. A standard practice to account for multiple simultaneous structures in convex optimization is to add further regularizers or constraints. From the compressed sensing perspective, one then hopes to also improve the sampling rate. Unfortunately, when taking simple combinations of regularizers, this does not seem to happen automatically, as has been shown for several examples in recent works. Here, we give an overview of ideas for combining multiple structures in convex programs by taking weighted sums and weighted maximums. We explicitly discuss cases where optimal weights are used, reflecting an optimal tuning of the reconstruction. In particular, we extend known lower bounds on the number of required measurements to the optimally weighted maximum by using geometric arguments. As examples, we discuss simultaneously low-rank and sparse matrices and notions of matrix norms (in the “square deal” sense) for regularizing tensor products. We state an SDP formulation for numerically estimating the statistical dimensions and find a tensor case where the lower bound is roughly met up to a factor of two.
ER -