
Low-Rank Matrix Regression via Least-Angle Regression

Publication: Contribution to journal › Article › Research › Peer review

Authors

Organisational units

Details

Original language: English
Pages (from-to): 637-642
Number of pages: 6
Journal: IEEE Control Systems Letters
Volume: 9
Early online date: 5 June 2025
Publication status: Published - 2025

Abstract

Low-rank matrix regression is a fundamental problem in data science with various applications in systems and control. Nuclear norm regularization has been widely applied to solve this problem due to its convexity. However, it suffers from high computational complexity and the inability to directly specify the rank. This work introduces a novel framework for low-rank matrix regression that addresses both unstructured and Hankel matrices. By decomposing the low-rank matrix into rank-1 bases, the problem is reformulated as an infinite-dimensional sparse learning problem. The least-angle regression (LAR) algorithm is then employed to solve this problem efficiently. For unstructured matrices, a closed-form LAR solution is derived with equivalence to a normalized nuclear norm regularization problem. For Hankel matrices, a real-valued polynomial basis reformulation enables effective LAR implementation. Two numerical examples in network modeling and system realization demonstrate that the proposed approach significantly outperforms the nuclear norm method in terms of estimation accuracy and computational efficiency.
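To make the contrast concrete, here is a minimal sketch of the two ideas the abstract compares, for the simplest case of an identity measurement operator (matrix denoising). It is not the paper's algorithm: singular-value soft-thresholding stands in for nuclear norm regularization, and a greedy rank-1 pursuit illustrates the rank-1 basis decomposition in which the rank is specified directly (the paper's LAR scheme selects such rank-1 atoms far more carefully, and also handles Hankel structure).

```python
import numpy as np

rng = np.random.default_rng(0)

# Ground truth: a rank-2 matrix observed under noise. The paper treats
# general linear measurements and Hankel matrices; this sketch does not.
U0 = rng.standard_normal((20, 2))
V0 = rng.standard_normal((2, 15))
X_true = U0 @ V0
Y = X_true + 0.1 * rng.standard_normal(X_true.shape)

def nuclear_norm_prox(Y, lam):
    """Singular-value soft-thresholding: the proximal operator of
    lam * ||X||_*, the workhorse step of nuclear norm methods.
    The rank of the result is controlled only indirectly via lam."""
    U, s, Vt = np.linalg.svd(Y, full_matrices=False)
    return U @ np.diag(np.maximum(s - lam, 0.0)) @ Vt

def greedy_rank1(Y, r):
    """Greedy rank-1 pursuit: repeatedly peel off the dominant rank-1
    component of the residual. Illustrates (only) the rank-1 basis
    decomposition idea; the rank r is specified directly."""
    X, R = np.zeros_like(Y), Y.copy()
    for _ in range(r):
        U, s, Vt = np.linalg.svd(R, full_matrices=False)
        atom = s[0] * np.outer(U[:, 0], Vt[0, :])
        X += atom
        R -= atom
    return X

X_nn = nuclear_norm_prox(Y, lam=0.5)   # rank depends on the threshold
X_gr = greedy_rank1(Y, r=2)            # rank chosen explicitly
print(np.linalg.matrix_rank(X_gr))     # prints 2
```

For the identity operator, `greedy_rank1` reduces to a truncated SVD; the point of the sketch is only the structural difference the abstract highlights, namely that the rank-1 decomposition view lets the rank be fixed a priori, whereas nuclear norm regularization tunes it through a penalty weight.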

ASJC Scopus subject areas

Cite

Low-Rank Matrix Regression via Least-Angle Regression. / Yin, Mingzhou; Muller, Matthias A.
In: IEEE Control Systems Letters, Vol. 9, 2025, pp. 637-642.


Yin M, Muller MA. Low-Rank Matrix Regression via Least-Angle Regression. IEEE Control Systems Letters. 2025;9:637-642. Epub 2025 Jun 5. doi: 10.1109/LCSYS.2025.3577081, 10.48550/arXiv.2503.10569
Yin, Mingzhou ; Muller, Matthias A. / Low-Rank Matrix Regression via Least-Angle Regression. In: IEEE Control Systems Letters. 2025 ; Vol. 9. pp. 637-642.
@article{750d4471c36d4ed2acc3d9d4639e01c6,
title = "Low-Rank Matrix Regression via Least-Angle Regression",
abstract = "Low-rank matrix regression is a fundamental problem in data science with various applications in systems and control. Nuclear norm regularization has been widely applied to solve this problem due to its convexity. However, it suffers from high computational complexity and the inability to directly specify the rank. This work introduces a novel framework for low-rank matrix regression that addresses both unstructured and Hankel matrices. By decomposing the low-rank matrix into rank-1 bases, the problem is reformulated as an infinite-dimensional sparse learning problem. The least-angle regression (LAR) algorithm is then employed to solve this problem efficiently. For unstructured matrices, a closed-form LAR solution is derived with equivalence to a normalized nuclear norm regularization problem. For Hankel matrices, a real-valued polynomial basis reformulation enables effective LAR implementation. Two numerical examples in network modeling and system realization demonstrate that the proposed approach significantly outperforms the nuclear norm method in terms of estimation accuracy and computational efficiency.",
keywords = "identification, Low-rank approximation, model reduction, subspace methods",
author = "Mingzhou Yin and Muller, {Matthias A.}",
note = "Publisher Copyright: {\textcopyright} 2017 IEEE.",
year = "2025",
doi = "10.1109/LCSYS.2025.3577081",
language = "English",
volume = "9",
pages = "637--642",
journal = "IEEE Control Systems Letters",
}


TY - JOUR

T1 - Low-Rank Matrix Regression via Least-Angle Regression

AU - Yin, Mingzhou

AU - Muller, Matthias A.

N1 - Publisher Copyright: © 2017 IEEE.

PY - 2025

Y1 - 2025

N2 - Low-rank matrix regression is a fundamental problem in data science with various applications in systems and control. Nuclear norm regularization has been widely applied to solve this problem due to its convexity. However, it suffers from high computational complexity and the inability to directly specify the rank. This work introduces a novel framework for low-rank matrix regression that addresses both unstructured and Hankel matrices. By decomposing the low-rank matrix into rank-1 bases, the problem is reformulated as an infinite-dimensional sparse learning problem. The least-angle regression (LAR) algorithm is then employed to solve this problem efficiently. For unstructured matrices, a closed-form LAR solution is derived with equivalence to a normalized nuclear norm regularization problem. For Hankel matrices, a real-valued polynomial basis reformulation enables effective LAR implementation. Two numerical examples in network modeling and system realization demonstrate that the proposed approach significantly outperforms the nuclear norm method in terms of estimation accuracy and computational efficiency.

AB - Low-rank matrix regression is a fundamental problem in data science with various applications in systems and control. Nuclear norm regularization has been widely applied to solve this problem due to its convexity. However, it suffers from high computational complexity and the inability to directly specify the rank. This work introduces a novel framework for low-rank matrix regression that addresses both unstructured and Hankel matrices. By decomposing the low-rank matrix into rank-1 bases, the problem is reformulated as an infinite-dimensional sparse learning problem. The least-angle regression (LAR) algorithm is then employed to solve this problem efficiently. For unstructured matrices, a closed-form LAR solution is derived with equivalence to a normalized nuclear norm regularization problem. For Hankel matrices, a real-valued polynomial basis reformulation enables effective LAR implementation. Two numerical examples in network modeling and system realization demonstrate that the proposed approach significantly outperforms the nuclear norm method in terms of estimation accuracy and computational efficiency.

KW - identification

KW - Low-rank approximation

KW - model reduction

KW - subspace methods

UR - http://www.scopus.com/inward/record.url?scp=105007601918&partnerID=8YFLogxK

U2 - 10.1109/LCSYS.2025.3577081

DO - 10.1109/LCSYS.2025.3577081

M3 - Article

AN - SCOPUS:105007601918

VL - 9

SP - 637

EP - 642

JO - IEEE Control Systems Letters

JF - IEEE Control Systems Letters

ER -
