
Approximate 1-Norm Minimization and Minimum-Rank Structured Sparsity for Various Generalized Inverses via Local Search

Published: 01 January 2021

Abstract

Fundamental in matrix algebra and its applications, a generalized inverse of a real matrix $A$ is a matrix $H$ that satisfies the Moore--Penrose (M--P) property $AHA=A$. If $H$ also satisfies the additional useful M--P property, $HAH=H$, it is called a reflexive generalized inverse. Reflexivity is equivalent to minimum rank, so we are particularly interested in reflexive generalized inverses. We consider aspects of symmetry related to the calculation of a sparse reflexive generalized inverse of $A$. As is common, and following Fampa and Lee [Oper. Res. Lett., 46 (2018), pp. 605--610] for calculating sparse generalized inverses, we use (vector) 1-norm minimization for inducing sparsity and for keeping the magnitude of entries under control. When $A$ is symmetric, we may naturally desire a symmetric $H$, while generally such a restriction on $H$ may not lead to a 1-norm minimizing reflexive generalized inverse. We investigate a block construction method to produce a symmetric reflexive generalized inverse that is structured and has guaranteed sparsity. We provide a theoretically efficient and practical local-search algorithm to block construct an approximate 1-norm minimizing symmetric reflexive generalized inverse. Another aspect of symmetry that we consider relates to another M--P property: $H$ is ah-symmetric if $AH$ is symmetric. The ah-symmetry property is the key one for solving least-squares problems using $H$. Here we do not assume that $A$ is symmetric, and we do not impose symmetry on $H$. We investigate a column block construction method to produce an ah-symmetric reflexive generalized inverse that is structured and has guaranteed sparsity. We provide a theoretically efficient and practical local-search algorithm to column block construct an approximate 1-norm minimizing ah-symmetric reflexive generalized inverse.
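To make the 1-norm approach in the abstract concrete, the sketch below computes a 1-norm minimizing generalized inverse of a given matrix by linear programming, imposing only the M--P property $AHA=A$ (cf. [7], [9]). It is a minimal illustration, not the paper's method: the reflexivity property $HAH=H$, the symmetric and ah-symmetric block constructions, and the local-search algorithms are not reproduced here. The function name, the random test matrix, and the use of scipy.optimize.linprog are assumptions of this sketch.

```python
# Minimal sketch (illustrative, not the paper's algorithm): compute a 1-norm
# minimizing generalized inverse H of A, i.e. minimize sum_ij |H_ij| subject
# to the M--P property A H A = A, formulated as a linear program.
import numpy as np
from scipy.optimize import linprog


def one_norm_generalized_inverse(A):
    """LP for min ||vec(H)||_1 s.t. A H A = A (reflexivity HAH = H is NOT enforced)."""
    m, n = A.shape
    nv = n * m  # number of entries of H (H is n-by-m)
    # Decision variables: [vec(H); T] with entrywise |H| <= T; minimize sum(T).
    c = np.concatenate([np.zeros(nv), np.ones(nv)])
    # A H A = A  <=>  (A^T kron A) vec(H) = vec(A)  (column-major vec).
    A_eq = np.hstack([np.kron(A.T, A), np.zeros((m * n, nv))])
    b_eq = A.flatten(order="F")
    # |H_ij| <= T_ij  encoded as  H - T <= 0  and  -H - T <= 0.
    I = np.eye(nv)
    A_ub = np.vstack([np.hstack([I, -I]), np.hstack([-I, -I])])
    b_ub = np.zeros(2 * nv)
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
                  bounds=[(None, None)] * nv + [(0, None)] * nv, method="highs")
    return res.x[:nv].reshape((n, m), order="F")


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A = rng.standard_normal((5, 3)) @ rng.standard_normal((3, 4))  # rank-3, 5-by-4 test matrix
    H = one_norm_generalized_inverse(A)
    print("max |AHA - A|:", np.max(np.abs(A @ H @ A - A)))   # ~0 up to LP tolerance
    print("nonzeros in H:", int(np.sum(np.abs(H) > 1e-8)))   # typically sparser than pinv(A)
```

In contrast, the paper's block-construction and local-search methods produce an $H$ that additionally satisfies $HAH=H$, together with symmetry (when $A$ is symmetric) or ah-symmetry, with guaranteed structured sparsity and an approximation guarantee on the 1-norm.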

References

[1] A. Bjerhammar, Application of calculus of matrices to method of least squares with special reference to geodetic calculations, Trans. Roy. Inst. Tech. Stockholm, 1951 (1951), no. 49.
[2] S. L. Campbell and C. D. Meyer, Generalized Inverses of Linear Transformations, SIAM, Philadelphia, 2009, https://doi.org/10.1137/1.9780898719048.
[3] I. Dokmanić and R. Gribonval, Beyond Moore-Penrose Part I: Generalized Inverses that Minimize Matrix Norms, preprint, http://arxiv.org/abs/1706.08349, 2017.
[4] I. Dokmanić and R. Gribonval, Beyond Moore-Penrose Part II: The Sparse Pseudoinverse, preprint, https://arxiv.org/abs/1706.08701, 2017.
[5] I. Dokmanić, M. Kolundžija, and M. Vetterli, Beyond Moore-Penrose: Sparse pseudoinverse, in Proceedings of the IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), 2013, pp. 6526--6530.
[6] A. Dresden, The fourteenth western meeting of the American Mathematical Society, Bull. Amer. Math. Soc., 26 (1920), pp. 385--396.
[7] M. Fampa and J. Lee, On sparse reflexive generalized inverses, Oper. Res. Lett., 46 (2018), pp. 605--610.
[8] M. Fampa, J. Lee, G. Ponte, and L. Xu, Experimental Analysis of Local Search for Sparse Reflexive Generalized Inverses, preprint, https://arxiv.org/abs/2001.03732, 2020.
[9] V. Fuentes, M. Fampa, and J. Lee, Sparse pseudoinverses via LP and SDP relaxations of Moore-Penrose, in Proceedings of the XVIII Latin-Iberoamerican Conference on Operations Research (CLAIO), 2016, pp. 343--350.
[10] V. K. Fuentes, M. Fampa, and J. Lee, Diving for sparse partially-reflexive generalized inverses, in Optimization of Complex Systems: Theory, Models, Algorithms and Applications, H. A. Le Thi, H. M. Le, and T. Pham Dinh, eds., Springer, Cham, 2020, pp. 89--98.
[11] T. Gkountouvas, V. Karakasis, K. Kourtis, G. Goumas, and N. Koziris, Improving the performance of the symmetric sparse matrix-vector multiplication in multicore, in Proceedings of the 27th IEEE International Symposium on Parallel and Distributed Processing, 2013, pp. 273--283.
[12] G. Golub and C. Van Loan, Matrix Computations, 3rd ed., Johns Hopkins University Press, Baltimore, MD, 1996.
[13] B. K. Natarajan, Sparse approximate solutions to linear systems, SIAM J. Comput., 24 (1995), pp. 227--234, https://doi.org/10.1137/S0097539792240406.
[14] R. Penrose, A generalized inverse for matrices, Proc. Cambridge Philos. Soc., 51 (1955), pp. 406--413.
[15] C. Rao and S. Mitra, Generalized Inverse of Matrices and Its Applications, John Wiley & Sons, New York, London, Sydney, 1971.
[16] C. Rohde, Contributions to the Theory, Computation and Application of Generalized Inverses, Ph.D. thesis, University of North Carolina, Raleigh, NC, 1964, https://repository.lib.ncsu.edu/handle/1840.4/2316?show=full.
[17] K. Schacke, On the Kronecker Product, Master's thesis, University of Waterloo, Waterloo, ON, Canada, 2004.
[18] D. Williamson and D. Shmoys, The Design of Approximation Algorithms, 1st ed., Cambridge University Press, New York, 2011.


Published In

SIAM Journal on Optimization, Volume 31, Issue 3
DOI: 10.1137/sjope8.31.3

Publisher

Society for Industrial and Applied Mathematics

United States


Author Tags

  1. generalized inverse
  2. sparse optimization
  3. approximation algorithm

MSC Codes

  • Primary: 90C26, 90C25
  • Secondary: 15A09, 65K05

Qualifiers

  • Research-article
