Sparsity-cognizant total least-squares for perturbed compressive sampling
IEEE Transactions on Signal Processing, 2011
Solving linear regression problems based on the total least-squares (TLS) criterion has well-documented merits in various applications where perturbations appear in both the data vector and the regression matrix. However, existing TLS approaches do not account for sparsity possibly present in the unknown vector of regression coefficients. On the other hand, sparsity is the key attribute exploited by modern compressive sampling and variable selection approaches to linear regression, which include noise in the data but do not account for perturbations in the regression matrix. The present paper fills this gap by formulating and solving (regularized) TLS optimization problems under sparsity constraints. Near-optimum and reduced-complexity suboptimum sparse (S-) TLS algorithms are developed to address the perturbed compressive sampling (and the related dictionary learning) challenge, when there is a mismatch between the true and adopted bases over which the unknown vector is sparse. The novel S-TLS schemes also allow for perturbations in the regression matrix of the least-absolute shrinkage and selection operator (Lasso), and endow TLS approaches with the ability to cope with sparse, under-determined "errors-in-variables" models. Interesting generalizations can further exploit prior knowledge on the perturbations to obtain novel weighted and structured S-TLS solvers. Analysis and simulations demonstrate the practical impact of S-TLS in calibrating the mismatch effects of contemporary grid-based approaches to cognitive radio sensing, and robust direction-of-arrival estimation using antenna arrays.
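As a rough sketch of the kind of problem the abstract describes (the symbols y, A, x, E, e and the weight \lambda are generic placeholders, and the exact weighting and constraints follow the paper rather than this note), a regularized TLS criterion under a sparsity-promoting penalty can be written as

\min_{\mathbf{x},\,\mathbf{E},\,\mathbf{e}} \; \big\| [\,\mathbf{E}\;\;\mathbf{e}\,] \big\|_F^2 + \lambda \|\mathbf{x}\|_1
\quad \text{subject to} \quad \mathbf{y} + \mathbf{e} = (\mathbf{A} + \mathbf{E})\,\mathbf{x},

where E and e model the perturbations in the regression matrix and the data vector, respectively, and the \ell_1 term encourages a sparse coefficient vector as in the Lasso. Eliminating the perturbation variables in the unstructured case gives the familiar TLS-style fractional form

\min_{\mathbf{x}} \; \frac{\|\mathbf{y} - \mathbf{A}\mathbf{x}\|_2^2}{1 + \|\mathbf{x}\|_2^2} + \lambda \|\mathbf{x}\|_1,

which illustrates how the regression-matrix perturbations enter the objective; this is only an illustrative reduction under standard TLS assumptions, not a restatement of the paper's algorithms.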