Analysis of high-dimensional data in modern applications, such as spectral analysis, neuroscience, and chemometrics, naturally requires tensorial approaches that differ from standard matrix factorizations (PCA, ICA, NMF). The Tucker decomposition and its constrained versions, with sparsity and/or nonnegativity constraints, allow for the extraction of a different number of hidden factors in each mode and permit interactions within each modality…
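As a minimal illustration of the Tucker model described above (not code from the paper), a NumPy sketch of reconstructing a 3-way tensor from a small core and one factor matrix per mode:

```python
import numpy as np

def mode_n_product(tensor, matrix, mode):
    # Multiply a tensor by a matrix along the given mode: contract the
    # tensor's `mode` axis with the matrix columns, then move the new
    # axis back into position.
    result = np.tensordot(tensor, matrix, axes=(mode, 1))
    return np.moveaxis(result, -1, mode)

def tucker_reconstruct(core, factors):
    # X ~ G x_1 A x_2 B x_3 C: apply each factor along its own mode,
    # so each mode may have a different number of hidden factors.
    x = core
    for mode, factor in enumerate(factors):
        x = mode_n_product(x, factor, mode)
    return x

# Nonnegative core of size (2, 3, 2) mapped to a 5 x 6 x 4 tensor.
rng = np.random.default_rng(0)
core = rng.random((2, 3, 2))
factors = [rng.random((5, 2)), rng.random((6, 3)), rng.random((4, 2))]
x = tucker_reconstruct(core, factors)
print(x.shape)
```

With nonnegative core and factors, the reconstruction is itself nonnegative, which is the model class the constrained (NTD) variants fit.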
In this paper, we propose new and efficient algorithms for nonnegative Tucker decomposition (NTD): the Fast α-NTD algorithm, which is more precise and faster than α-NTD [1], and the β-NTD algorithm, based on the β-divergence. These new algorithms include efficient normalization and initialization steps that considerably reduce the running time and dramatically improve performance. Moreover, the multilevel NTD…
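The paper's α- and β-NTD updates are not reproduced in this excerpt. As a hedged illustration of the family they generalize, here is the classical multiplicative update for the Euclidean (β = 2) matrix case, i.e. plain NMF; the function name and parameters are illustrative, not the paper's:

```python
import numpy as np

def nmf_multiplicative(X, rank, iters=200, eps=1e-9, seed=0):
    # Multiplicative updates for min ||X - W H||_F^2 with W, H >= 0,
    # the beta = 2 member of the beta-divergence family. The small eps
    # keeps denominators positive so factors never leave the feasible set.
    rng = np.random.default_rng(seed)
    m, n = X.shape
    W = rng.random((m, rank)) + eps
    H = rng.random((rank, n)) + eps
    for _ in range(iters):
        H *= (W.T @ X) / (W.T @ W @ H + eps)
        W *= (X @ H.T) / (W @ H @ H.T + eps)
    return W, H

X = np.random.default_rng(1).random((8, 10))
W, H = nmf_multiplicative(X, rank=8)
err = np.linalg.norm(X - W @ H) / np.linalg.norm(X)
```

Because the updates are element-wise multiplications by nonnegative ratios, nonnegativity of W and H is preserved automatically at every iteration.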
2008 IEEE Workshop on Machine Learning for Signal Processing, 2008
In this paper we propose a new iterative thresholding algorithm for distributed compressed sensing (CS), based on a set of local cost functions and referred to as the HALS-CS algorithm. This algorithm reconstructs all sources simultaneously by processing the compressed signals row by row. Moreover, with an adaptive, nonlinearly decreasing thresholding strategy, we are able to reconstruct almost perfectly…
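The paper's adaptive thresholding schedule is not given in this excerpt. As a generic stand-in, a sketch of iterative soft thresholding with a geometrically decreasing threshold on a single sparse source (parameters `t0` and `decay` are illustrative, not the paper's):

```python
import numpy as np

def soft_threshold(x, t):
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def ista_decreasing(A, y, iters=500, t0=0.5, decay=0.99):
    # Gradient step on ||y - A x||^2 followed by soft thresholding,
    # with the threshold shrinking geometrically over the iterations.
    L = np.linalg.norm(A, 2) ** 2  # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    t = t0
    for _ in range(iters):
        x = soft_threshold(x + A.T @ (y - A @ x) / L, t / L)
        t *= decay
    return x

# Noiseless toy problem: 5-sparse signal, 40 Gaussian measurements.
rng = np.random.default_rng(2)
A = rng.standard_normal((40, 100)) / np.sqrt(40)
x_true = np.zeros(100)
x_true[rng.choice(100, 5, replace=False)] = rng.choice([-1.0, 1.0], 5)
y = A @ x_true
x_hat = ista_decreasing(A, y)
```

Starting with a large threshold suppresses spurious coefficients early; decreasing it lets the surviving support be fit with ever smaller bias, which is the intuition behind a nonlinearly decreasing schedule.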
Parallel factor analysis (PARAFAC) is a multi-way decomposition method that finds hidden factors in raw tensor data, with many potential applications in neuroscience, bioinformatics, chemometrics, etc. [1,2]. The Alternating Least Squares (ALS) algorithm can explain the raw tensor by a small number of rank-one tensors with a high fit. However, for large-scale data, due to the necessity…
2011 IEEE Statistical Signal Processing Workshop (SSP), 2011
Algorithms based on alternating optimization for nonnegative Tucker decomposition (NTD), such as ALS, multiplicative least squares, and HALS, have proven effective and efficient. However, these algorithms often converge very slowly. To this end, we propose a novel algorithm for NTD using the Levenberg-Marquardt technique, with a fast computation method that constructs the approximate Hessian and gradient without building up the large-scale Jacobian.
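A generic Levenberg-Marquardt iteration, shown on a small curve-fitting problem rather than NTD (the paper's contribution is forming the approximate Hessian JᵀJ and gradient Jᵀr without materializing J, which this sketch does not reproduce):

```python
import numpy as np

def levenberg_marquardt(residual, jacobian, theta, iters=50, lam=1e-2):
    # Damped Gauss-Newton: solve (J^T J + lam I) d = -J^T r each step;
    # shrink the damping lam when the step reduces the cost, grow it
    # otherwise, interpolating between gradient descent and Gauss-Newton.
    for _ in range(iters):
        r = residual(theta)
        J = jacobian(theta)
        d = np.linalg.solve(J.T @ J + lam * np.eye(len(theta)), -J.T @ r)
        if np.sum(residual(theta + d) ** 2) < np.sum(r ** 2):
            theta = theta + d
            lam *= 0.5
        else:
            lam *= 2.0
    return theta

# Toy problem: fit y = a * exp(b x), starting from (a, b) = (1, 0).
x = np.linspace(0.0, 1.0, 20)
y = 2.0 * np.exp(-1.5 * x)
residual = lambda th: th[0] * np.exp(th[1] * x) - y
jacobian = lambda th: np.column_stack([np.exp(th[1] * x),
                                       th[0] * x * np.exp(th[1] * x)])
theta = levenberg_marquardt(residual, jacobian, np.array([1.0, 0.0]))
```

For NTD the parameter vector stacks all factor entries, so J is huge; avoiding its explicit construction is what makes the second-order step tractable.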
2014 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), 2014
We propose algorithms for Tucker tensor decomposition that avoid computing the singular value decomposition or eigenvalue decomposition of large matrices, as in the workhorse higher-order orthogonal iteration (HOOI) algorithm. The novel algorithms have a computational cost of O(I³R), which is cheaper than the O(I³R + IR⁴ + R⁶) of HOOI for multilinear rank-(R, R, R) tensors of size I × I × I.
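For contrast, a plain HOOI sketch in NumPy (the baseline, not the proposed algorithms); the SVDs of the projected unfoldings below are exactly the step whose cost the proposed methods avoid:

```python
import numpy as np

def unfold(T, mode):
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def project(T, factors, skip=None):
    # Apply U_m^T along every mode except `skip`.
    Y = T
    for m, U in enumerate(factors):
        if m != skip:
            Y = np.moveaxis(np.tensordot(Y, U, axes=(m, 0)), -1, m)
    return Y

def hooi(T, ranks, iters=10):
    # Higher-order orthogonal iteration: each factor is the leading
    # left singular vectors of the tensor projected onto the other
    # factors, initialized by truncated HOSVD.
    factors = [np.linalg.svd(unfold(T, n), full_matrices=False)[0][:, :r]
               for n, r in enumerate(ranks)]
    for _ in range(iters):
        for n in range(T.ndim):
            Y = project(T, factors, skip=n)
            U = np.linalg.svd(unfold(Y, n), full_matrices=False)[0]
            factors[n] = U[:, :ranks[n]]
    return project(T, factors), factors

# Exact multilinear rank-(2, 2, 2) tensor of size 5 x 6 x 4.
rng = np.random.default_rng(4)
G = rng.standard_normal((2, 2, 2))
Us = [np.linalg.qr(rng.standard_normal((s, 2)))[0] for s in (5, 6, 4)]
T = G
for n, U in enumerate(Us):
    T = np.moveaxis(np.tensordot(T, U, axes=(n, 1)), -1, n)
core, factors = hooi(T, (2, 2, 2))
```

Each unfolded projection here is an I × R² matrix, and its SVD is what drives the IR⁴ and R⁶ terms in HOOI's cost for I × I × I tensors.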
Papers by Anh Phan