A penalized method of alternating projections for weighted low-rank Hankel matrix optimization

Jian Shen*, Jein-Shan Chen, Hou-Duo Qi, Naihua Xiu

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

1 Citation (Scopus)

Abstract

Weighted low-rank Hankel matrix optimization has long been used to reconstruct contaminated signals or to forecast missing values for a wide class of time series. The Method of Alternating Projections (MAP) (i.e., alternately projecting onto a low-rank matrix manifold and the Hankel matrix subspace) is a leading method. Despite its wide use, MAP has long been criticized for lacking convergence guarantees and for ignoring the weights used to reflect the importance of the observed data. Most of the known results hold only in a local sense. In particular, the latest research shows that MAP may converge at a linear rate provided that the initial point is close enough to a true solution and a transversality condition is satisfied. In this paper, we propose a globalized variant of MAP through a penalty approach. The proposed method inherits the favourable local properties of MAP and has the same computational complexity. Moreover, it is capable of handling a general weight matrix, is globally convergent, and enjoys a local linear convergence rate provided that the cut-off singular values are significantly smaller than the retained ones. Furthermore, the new method also applies to complex data. Extensive numerical experiments demonstrate the efficiency of the proposed method against several popular variants of MAP.
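The plain (unweighted, unpenalized) MAP scheme that the abstract describes can be sketched as follows. This is an illustrative sketch only, not the penalized method proposed in the paper; the function names (`project_rank`, `project_hankel`, `map_hankel`) and the window size `m` are assumptions for the example. The rank projection is a truncated SVD; the Hankel projection averages each anti-diagonal.

```python
import numpy as np

def hankel_from_signal(x, m):
    # Build an m x (n - m + 1) Hankel matrix whose anti-diagonals
    # carry the entries of the signal x.
    n = len(x)
    return np.array([x[i:i + n - m + 1] for i in range(m)])

def project_rank(M, r):
    # Projection onto the manifold of matrices of rank <= r
    # via truncated singular value decomposition.
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    return (U[:, :r] * s[:r]) @ Vt[:r, :]

def project_hankel(M):
    # Projection onto the Hankel subspace: replace every
    # anti-diagonal by its average.
    m, k = M.shape
    H = np.zeros_like(M)
    for d in range(m + k - 1):
        idx = [(i, d - i) for i in range(max(0, d - k + 1), min(m, d + 1))]
        avg = np.mean([M[i, j] for i, j in idx])
        for i, j in idx:
            H[i, j] = avg
    return H

def map_hankel(x, m, r, iters=100):
    # Plain MAP: alternate the two projections, starting from the
    # Hankel matrix of the observed (possibly noisy) signal.
    M = hankel_from_signal(x, m)
    for _ in range(iters):
        M = project_hankel(project_rank(M, r))
    return M
```

For example, a sampled cosine is a sum of two complex exponentials, so its Hankel matrix has rank 2; running `map_hankel` on a lightly contaminated cosine drives the iterate toward a nearby rank-2 Hankel matrix. Note that, as the abstract points out, this basic scheme ignores observation weights and carries no global convergence guarantee, which is what motivates the penalized variant.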

Original language: English
Pages (from-to): 417-450
Number of pages: 34
Journal: Mathematical Programming Computation
Volume: 14
Issue number: 3
Publication status: Published - Sept 2022

Keywords

  • Alternating projections
  • Global convergence
  • Hankel matrix
  • Linear convergence
  • Time series

ASJC Scopus subject areas

  • Theoretical Computer Science
  • Software

